
Re: Verify Jobs in Jenkins being changed to "build Unstable"

Yang Bin
 

I observe the same issue. I guess this is due to some global setting change, e.g. the global JJB change mentioned recently.

Bin

On Mar 5, 2019, at 18:48, Liam Fallon <liam.fallon@...> wrote:

Hi All,

Anyone see Jenkins verify jobs failing this morning?

Symptoms:
- The build runs as normal and the maven jobs all return success
- The build is changed to "BUILD UNSTABLE" at the end of the job

04:38:53 Build step 'Execute Scripts' changed build result to UNSTABLE

For example, see the following jobs:
https://jenkins.onap.org/job/policy-models-master-verify-java/33/
https://jenkins.onap.org/view/sdc/job/sdc-master-release-version-java-daily/567/console
https://jenkins.onap.org/view/so/job/so-master-verify-java/1309/console

Best Regards
Liam




Request help on Heat template creation

Ying, Ruoyu
 

Hi,

I need to create a Heat template for the vIPsec VNF, but I’ve never written a Heat template from scratch. Can anyone share some best known methods (BKM) for that?
Thanks.

Best Regards,
Ruoyu
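
For reference, a Heat template (HOT) is just a YAML file with `heat_template_version`, `parameters`, and `resources` sections. A minimal sketch that boots a single server is shown below; the names (`vipsec_image`, `vipsec_flavor`, `vipsec_server`) are placeholders for illustration, not taken from any real vIPsec template:

```yaml
heat_template_version: 2015-04-30

description: Minimal example HOT template (placeholder names and values)

parameters:
  vipsec_image:
    type: string
    description: Glance image name for the VM
  vipsec_flavor:
    type: string
    description: Nova flavor for the VM

resources:
  vipsec_server:
    type: OS::Nova::Server
    properties:
      name: vipsec-vm-0
      image: { get_param: vipsec_image }
      flavor: { get_param: vipsec_flavor }
```

The ONAP demo repository's vFW/vLB Heat templates are a common starting point to copy and adapt from.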


RE: [onap-discuss] dmaap-message-router NodePort not reachable

Calamita Agostino
 

Thanks Brian.

 

To use 3.0.1-ONAP, can I clone the oom repo as described in the onap.readthedocs.io documentation, under “OOM Quick Start Guide”,

 

git clone -b casablanca http://gerrit.onap.org/r/oom

 

or is there another command / repository?
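
(For what it's worth: maintenance releases are normally also published as tags in the same oom repository, so cloning by tag may be an option. The exact tag name 3.0.1-ONAP is an assumption on my part; verify it after cloning.)

```shell
# Same repository; check out the maintenance tag instead of the casablanca branch.
# Tag name assumed; list the available tags with: git tag -l '3.0.*'
git clone -b 3.0.1-ONAP http://gerrit.onap.org/r/oom
```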

 

Agos

 

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Monday, March 4, 2019 15:55
To: Calamita Agostino <agostino.calamita@...>; 'onap-discuss@...' <onap-discuss@...>
Subject: RE: [onap-discuss] dmaap-message-router NodePort not reachable

 

Agos.,

 

Make sure you are using 3.0.1-ONAP (Maintenance release)

 

That being said, the SDC team thinks it's a timing issue: DMaaP has to be up (with a clean dockerdata-nfs) before SDC.

 

Brian

 

 

 

From: FREEMAN, BRIAN D
Sent: Monday, March 04, 2019 9:31 AM
To: Calamita Agostino <agostino.calamita@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not reachable

 

Hmm

 

Those two SDC-BE messages are for clients trying to register, so I think that is expected until SDC registers with DMaaP/MR.

Is there an SDC-BE error on its communication with DMaaP/MR?

 

The other thing to try is a redeploy of dev-dmaap (remove /dockerdata-nfs/dev-dmaap to clean out the topic registrations)

and dev-sdc (again, just to start them fresh).
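
Spelled out as commands, the redeploy sequence described above might look like this (a sketch only: the dev- release names and the /dockerdata-nfs path follow this thread's setup, and the override files are whatever you originally deployed with):

```shell
# Redeploy DMaaP with clean persisted state (topic registrations live under
# /dockerdata-nfs/dev-dmaap in this thread's setup)
helm delete dev-dmaap --purge
rm -rf /dockerdata-nfs/dev-dmaap
helm deploy dev-dmaap local/onap -f /root/integration-override.yaml --namespace onap

# Then redeploy SDC so it registers against the fresh DMaaP
helm delete dev-sdc --purge
rm -rf /dockerdata-nfs/dev-sdc
helm deploy dev-sdc local/onap -f /root/integration-override.yaml --namespace onap
```

(Note that `helm deploy` is the OOM helm plugin used elsewhere in this thread, not a stock helm subcommand.)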

 

Brian

 

 

 

From: Calamita Agostino <agostino.calamita@...>
Sent: Monday, March 04, 2019 5:21 AM
To: FREEMAN, BRIAN D <bf1936@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not reachable

 

Hi, after redeploying SDC, healthcheck returns “DMaaP: None” in the SDC health check, and distribution of a service returns a POL5000 error.

 

In the SDC-FE error log file I see:

 

2019-03-04T09:52:38.847Z        [qtp215145189-44]       INFO    o.o.sdc.fe.servlets.FeProxyServlet      timer=12        ErrorCategory=INFO      RequestId=null ServiceName=SDC catalog serviceInstanceID=null  ErrorCode=0     uuid=599a9bb3-d3c8-4cea-a926-ea6a11762a63       userId=op0001   localAddr=10.42.236.172        remoteAddr=10.42.98.28  SC="500"

 

 

And in SDC-BE error log file I see these messages:

 

2019-03-04T10:15:21.209Z        [qtp215145189-16]       INFO    o.o.sdc.be.filters.BeServletFilter      AuditMessage=ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "clamp" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"       AlertSeverity=0 ElapsedTime=96  EndTimestamp=2019-03-04 10:15:21.208Z  auditOn=false   ServerFQDN=dev-sdc-sdc-be-656bd64b9b-5b89b      StatusCode=ERROR        timer=96        ServiceInstanceId=null  ClassName=org.openecomp.sdc.be.filters.BeServletFilter ResponseDescription=Internal Server Error       ResponseCode=500        InstanceUUID=clamp      RequestId=8c56c7a2-da62-4598-963d-b5ed13673fce PartnerName=Apache-HttpClient/4.5.6 (Java/1.8.0_181)    TargetEntity=registerInDistributionEngine       CustomField1=POST: https://sdc-be.onap:8443/sdc/v1/registerForDistribution     CustomField2=500        AuditBeginTimestamp=2019-03-04 10:15:21.112Z    RemoteHost=10.42.216.84        ErrorCategory=INFO      ServerIPAddress=10.42.13.89     ServiceName=/v1/registerForDistribution ErrorCode=0     POST /sdc/v1/registerForDistribution HTTP/1.1 SC="500"

 

2019-03-04T10:15:24.740Z        [qtp215145189-20]       ERROR   o.o.s.c.config.EcompErrorLogUtil        alarmSeverity=MAJOR     AuditBeginTimestamp=2019-03-04 10:15:24.690Z   AuditMessage=ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "policy" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"      RequestId=ab5939bb-6780-42e7-b63a-54381b74c352  ErrorCategory=ERROR     ServerIPAddress=10.42.13.89     ServiceName=/v1/registerForDistribution        ErrorCode=500   PartnerName=Apache-HttpClient/4.5.5 (Java/1.8.0_171)    auditOn=true    ServerFQDN=dev-sdc-sdc-be-656bd64b9b-5b89b    TargetEntity=registerInDistributionEngine        Error occured in Distribution Engine. Failed operation: registration validation failed

 

 

Any other checks to do?

 

Thanks.

Agos.

 

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 1, 2019 15:39
To: Calamita Agostino <agostino.calamita@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not reachable

 

I'd do a helm delete dev-sdc --purge

Delete /dockerdata-nfs/dev-so

Confirm pv/pvc/pod are gone

Then

 

helm deploy dev-sdc local/onap -f /root/oom/kubernetes/onap/resources/environments/public-cloud.yaml -f /root/integration-override.yaml --namespace onap  --verbose

 

(or whatever your override files are)

 

Looks like SDC came up before DMaaP and is confused.

 

There are some less intrusive things to try, but you need SDC to pass its health check (with DMaaP up from its perspective):

Basic SDC Health Check                                                (DMaaP:UP)| PASS |

 

 

Brian

 

 

From: Calamita Agostino <agostino.calamita@...>
Sent: Friday, March 01, 2019 9:30 AM
To: FREEMAN, BRIAN D <bf1936@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not reachable

 

It works.

 

curl -X POST http://138.132.168.85:30227/events/TEST_TOPIC -H 'cache-control: no-cache'   -H 'content-type: application/json'  -H 'postman-token: 1c679102-85e8-f1a2-e708-3e6d84f8ea06' -d '{ "test": "success",                "timestamp": "1/1/2020" }'

{

    "serverTimeMs": 1,

    "count": 1
}

 

curl -X GET 'http://138.132.168.85:30227/events/TEST_TOPIC/g1/c3?timeout=5000' -H 'accept: application/json'  -H 'cache-control: no-cache'  -H 'postman-token: 04778117-fd44-0cac-b70c-ef2a2c3024af'                       

["{\"test\":\"success\",\"timestamp\":\"1/1/2020\"}"]

 

Agos.

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 1, 2019 15:21
To: Calamita Agostino <agostino.calamita@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not reachable

 

Casablanca.

 

OK

 

Use curl or POSTMAN to write to a TEST_TOPIC (unauthenticated topics are created on demand).

(Replace 10.12.5.13 with one of your k8s host IPs.) You don't need the postman-token; modify for your environment and preferences, etc.

 

curl -X POST \

  http://10.12.5.13:30227/events/TEST_TOPIC \

  -H 'cache-control: no-cache' \

  -H 'content-type: application/json' \

  -H 'postman-token: 1c679102-85e8-f1a2-e708-3e6d84f8ea06' \

  -d '{ "test": "success",

               "timestamp": "1/1/2020"

}'

 

Then do a GET:

 

curl -X GET \

  'http://10.12.5.13:30227/events/TEST_TOPIC/g1/c3?timeout=5000' \

  -H 'accept: application/json' \

  -H 'cache-control: no-cache' \

  -H 'postman-token: 04778117-fd44-0cac-b70c-ef2a2c3024af'

 

 

You should get the test/timestamp object back on the GET (you have to execute the POST/GET twice on the initial topic create).

 

This is to confirm that Message Router is internally talking to itself correctly.

 

 

Brian

 

From: Calamita Agostino <agostino.calamita@...>
Sent: Friday, March 01, 2019 9:13 AM
To: FREEMAN, BRIAN D <bf1936@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not reachable

 

This is the output:

 

Executing robot tests at log level TRACE

[ ERROR ] Suite 'Testsuites' contains no tests with tag 'healthmr'.

 

Try --help for usage information.

command terminated with exit code 252

 

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 1, 2019 15:12
To: Calamita Agostino <agostino.calamita@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not reachable

 

Please try ./ete-k8s.sh onap healthmr

 

From: Calamita Agostino <agostino.calamita@...>
Sent: Friday, March 01, 2019 9:09 AM
To: FREEMAN, BRIAN D <bf1936@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not reachable

 

I didn't find a healthmr test, only health.

(

./ete-k8s.sh onap

Usage: ete-k8s.sh [namespace] [ health | healthdist | distribute | instantiate | instantiateVFWCL | instantiateDemoVFWCL |  | portal ] )

 

The command ./ete-k8s.sh onap health reports the list below (Basic DMAAP Message Router Health Check = PASS).

In my environment, some pods are not in the Running state:

 

dev-aai-aai-data-router-5d55646cdc-cc62v                      1/2       CrashLoopBackOff   1084       4d        10.42.79.203    onapkm3   <none>

dev-appc-appc-ansible-server-76fcf9454d-8km9d                 0/1       CrashLoopBackOff   1656       6d        10.42.212.202   onapkm0   <none>

dev-oof-oof-has-api-585497f5-ktjsv                            0/1       Init:0/3           1085       8d        10.42.86.82     onapkm0   <none>

dev-oof-oof-has-controller-9469b9ff8-td4k9                    0/1       Init:1/3           945        8d        10.42.5.110     onapkm2   <none>

dev-oof-oof-has-data-d559897dc-4lmkt                          0/1       Init:1/4           1091       8d        10.42.199.220   onapkm3   <none>

dev-oof-oof-has-healthcheck-jq9xq                             0/1       Init:0/1           1092       8d        10.42.242.145   onapkm3   <none>

dev-oof-oof-has-reservation-868c7c88ff-pv79n                  0/1       Init:1/4           1081       8d        10.42.176.61    onapkm1   <none>

dev-oof-oof-has-solver-6f8bc6fdf4-tw4cj                       0/1       Init:1/4           1084       8d        10.42.29.154    onapkm0   <none>

dev-sdnc-sdnc-ansible-server-7c76f965c6-hqtzl                 0/1       CrashLoopBackOff   1844       8d        10.42.202.36    onapkm3   <none>

dev-sdnc-sdnc-ueb-listener-6d74459c6-tdqhc                    0/1       CrashLoopBackOff   542        1d        10.42.219.51    onapkm2   <none>

 

and multicloud is not deployed.

 

==============================================================================

Testsuites

==============================================================================

Testsuites.Health-Check :: Testing ecomp components are available via calls.

==============================================================================

Basic A&AI Health Check                                               | PASS |

------------------------------------------------------------------------------

Basic AAF Health Check                                                | PASS |

------------------------------------------------------------------------------

Basic AAF SMS Health Check                                            | PASS |

------------------------------------------------------------------------------

Basic APPC Health Check                                               | PASS |

------------------------------------------------------------------------------

Basic CLI Health Check                                                | PASS |

------------------------------------------------------------------------------

Basic CLAMP Health Check                                              | PASS |

------------------------------------------------------------------------------

Basic DCAE Health Check                                               [ WARN ] Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e8102250>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /healthcheck

[ WARN ] Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e817a8d0>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /healthcheck

[ WARN ] Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e817a850>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /healthcheck

| FAIL |

ConnectionError: HTTPConnectionPool(host='dcae-healthcheck.onap', port=80): Max retries exceeded with url: /healthcheck (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e817a5d0>: Failed to establish a new connection: [Errno -2] Name or service not known',))

------------------------------------------------------------------------------

Basic DMAAP Data Router Health Check                                  | PASS |

------------------------------------------------------------------------------

Basic DMAAP Message Router Health Check                               | PASS |

------------------------------------------------------------------------------

Basic External API NBI Health Check                                   [ WARN ] Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e80d6c50>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /nbi/api/v3/status

[ WARN ] Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e80c4e10>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /nbi/api/v3/status

[ WARN ] Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e8199f50>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /nbi/api/v3/status

| FAIL |

ConnectionError: HTTPConnectionPool(host='nbi.onap', port=8080): Max retries exceeded with url: /nbi/api/v3/status (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e80e5410>: Failed to establish a new connection: [Errno -2] Name or service not known',))

------------------------------------------------------------------------------

Basic Log Elasticsearch Health Check                                  | PASS |

------------------------------------------------------------------------------

Basic Log Kibana Health Check                                         | PASS |

------------------------------------------------------------------------------

Basic Log Logstash Health Check                                       | PASS |

------------------------------------------------------------------------------

Basic Microservice Bus Health Check                                   | PASS |

------------------------------------------------------------------------------

Basic Multicloud API Health Check                                     | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-ocata API Health Check                               | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-pike API Health Check                                | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-titanium_cloud API Health Check                      | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-vio API Health Check                                 | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic OOF-Homing Health Check                                         | FAIL |

Test timeout 10 seconds exceeded.

------------------------------------------------------------------------------

Basic OOF-SNIRO Health Check                                          | PASS |

------------------------------------------------------------------------------

Basic OOF-CMSO Health Check                                           | PASS |

------------------------------------------------------------------------------

Basic Policy Health Check                                             | PASS |

------------------------------------------------------------------------------

Basic Pomba AAI-context-builder Health Check                          | PASS |

------------------------------------------------------------------------------

Basic Pomba SDC-context-builder Health Check                          | PASS |

------------------------------------------------------------------------------

Basic Pomba Network-discovery-context-builder Health Check            | PASS |

------------------------------------------------------------------------------

Basic Portal Health Check                                             | PASS |

------------------------------------------------------------------------------

Basic SDC Health Check                                                (DMaaP:None)| PASS |

------------------------------------------------------------------------------

Basic SDNC Health Check                                               | PASS |

------------------------------------------------------------------------------

Basic SO Health Check                                                 | PASS |

------------------------------------------------------------------------------

Basic UseCaseUI API Health Check                                      | PASS |

------------------------------------------------------------------------------

Basic VFC catalog API Health Check                                    | PASS |

------------------------------------------------------------------------------

Basic VFC emsdriver API Health Check                                  | PASS |

------------------------------------------------------------------------------

Basic VFC gvnfmdriver API Health Check                                | PASS |

------------------------------------------------------------------------------

Basic VFC huaweivnfmdriver API Health Check                           | PASS |

------------------------------------------------------------------------------

Basic VFC jujuvnfmdriver API Health Check                             | PASS |

------------------------------------------------------------------------------

Basic VFC multivimproxy API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VFC nokiavnfmdriver API Health Check                            | PASS |

------------------------------------------------------------------------------

Basic VFC nokiav2driver API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VFC nslcm API Health Check                                      | PASS |

------------------------------------------------------------------------------

Basic VFC resmgr API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC vnflcm API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC vnfmgr API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC vnfres API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC workflow API Health Check                                   | PASS |

------------------------------------------------------------------------------

Basic VFC ztesdncdriver API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VFC ztevnfmdriver API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VID Health Check                                                | PASS |

------------------------------------------------------------------------------

Basic VNFSDK Health Check                                             | PASS |

------------------------------------------------------------------------------

Basic Holmes Rule Management API Health Check                         | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Holmes Engine Management API Health Check                       | FAIL |

502 != 200

------------------------------------------------------------------------------

Testsuites.Health-Check :: Testing ecomp components are available ... | FAIL |

51 critical tests, 41 passed, 10 failed

51 tests total, 41 passed, 10 failed

==============================================================================

Testsuites                                                            | FAIL |

51 critical tests, 41 passed, 10 failed

51 tests total, 41 passed, 10 failed

==============================================================================

Output:  /share/logs/0001_ete_health/output.xml

Log:     /share/logs/0001_ete_health/log.html

Report:  /share/logs/0001_ete_health/report.html

command terminated with exit code 10

 

 

 

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 1, 2019 14:49
To: onap-discuss@...; Calamita Agostino <agostino.calamita@...>
Subject: RE: [onap-discuss] dmaap-message-router NodePort not reachable

 

Try a POST to make sure you can write to Message Router.

I doubt it's connectivity.

 

If you are on the master branch, try ./ete-k8s.sh onap healthmr to test a write/read to a test topic.

 

(Do it twice: the first time it creates the test topic, and Kafka doesn't forward the message until both the publisher and the subscriber have connected.)

 

Brian

 

 

From: onap-discuss@... <onap-discuss@...> On Behalf Of Calamita Agostino
Sent: Friday, March 01, 2019 4:30 AM
To: onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not reachable

 

I tried to execute a wget from the sdc-be pod to the message-router REST API, and I see that dmaap-message-router is reachable from sdc-be.

 

This is the result:

 

# kubectl exec -it  dev-sdc-sdc-be-656bd64b9b-jh57x  -n onap -- /bin/bash

 

bash-4.4# wget "http://message-router:3904/topics"

Connecting to message-router:3904 (10.43.1.20:3904)

topics               100% |*******************************|   131   0:00:00 ETA

bash-4.4# cat topics

{"topics": [

    "__consumer_offsets",

    "champRawEvents",

    "SDC-DISTR-NOTIF-TOPIC-AUTO",

    "org.onap.dmaap.mr.PNF_READY"

]}bash-4.4#

 

But the audit.log of sdc-be, after the “Distribution Service” action from the Portal, says:

 

2019-03-01T08:32:07.986Z        [qtp215145189-323354]   INFO    o.o.sdc.be.filters.BeServletFilter     

ResponseCode=500        InstanceUUID=null       RequestId=d2f65e19-b07b-4266-8be2-f170aba42fb1  AlertSeverity=0 ElapsedTime=3  

EndTimestamp=2019-03-01 08:32:07.986Z   PartnerName=op0001      auditOn=true    ServerFQDN=dev-sdc-sdc-be-656bd64b9b-jh57x      

StatusCode=ERROR        TargetEntity=Distribution Engine is DOWN       

CustomField1=POST: http://sdc-be.onap:8080/sdc2/rest/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate       

timer=3 CustomField2=500        AuditBeginTimestamp=2019-03-01 08:32:07.983Z    RemoteHost=10.42.194.84 ErrorCategory=ERROR    

ServerIPAddress=10.42.179.134   ServiceName=/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate      

ServiceInstanceId=null   ClassName=org.openecomp.sdc.be.filters.BeServletFilter  ResponseDescription=Internal Server Error      

ErrorCode=500   null

 

 

In the same log file I found a lot of messages like this one:

 

2019-03-01T09:21:31.850Z        [qtp215145189-399996]   INFO    o.o.sdc.be.filters.BeServletFilter      AuditMessage=ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "aai" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"  ResponseCode=500        InstanceUUID=aai-ml     RequestId=7f01a5b2-ee38-42c9-b7a4-330f020a4134 AlertSeverity=0  ElapsedTime=169 EndTimestamp=2019-03-01 09:21:31.850Z   PartnerName=Apache-HttpClient/4.5.6 (Java/1.8.0_171)    auditOn=true    ServerFQDN=dev-sdc-sdc-be-656bd64b9b-jh57x      StatusCode=ERROR        TargetEntity=registerInDistributionEngine       CustomField1=POST: https://sdc-be.onap:8443/sdc/v1/registerForDistribution      timer=169       CustomField2=500        AuditBeginTimestamp=2019-03-01 09:21:31.681Z    RemoteHost=10.42.209.109        ErrorCategory=ERROR     ServerIPAddress=10.42.179.134   ServiceName=/v1/registerForDistribution ServiceInstanceId=null  ClassName=org.openecomp.sdc.be.filters.BeServletFilter  ResponseDescription=Internal Server Error       ErrorCode=500   ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "aai" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"

 

Thanks.

 

From: onap-discuss@... [mailto:onap-discuss@...] On Behalf Of Calamita Agostino
Sent: Thursday, February 28, 2019 16:13
To: onap-discuss@...
Subject: [onap-discuss] dmaap-message-router NodePort not reachable

 

Hi all,

I have an issue related to connectivity from the sdc-be pod to dmaap-message-router.

My installation is Casablanca 3.0.0 on a 7-VM Kubernetes cluster.

 

All dmaap pods are up and running:

 

dev-dmaap-dbc-pg-0                                            1/1       Running            0          1d        10.42.173.158   onapkm5   <none>

dev-dmaap-dbc-pg-1                                            1/1       Running            0          1d        10.42.188.140   onapkm2   <none>

dev-dmaap-dbc-pgpool-7b748d5894-mr2m9                         1/1       Running            0          1d        10.42.237.193   onapkm3   <none>

dev-dmaap-dbc-pgpool-7b748d5894-n6dks                         1/1       Running            0          1d        10.42.192.244   onapkm2   <none>

dev-dmaap-dmaap-bus-controller-6757c4c86-8rq5p                1/1       Running            0          1d        10.42.185.132   onapkm1   <none>

dev-dmaap-dmaap-dr-db-bb4c67cfd-tm7td                         1/1       Running            0          1d        10.42.152.59    onapkm1   <none>

dev-dmaap-dmaap-dr-node-66c8749959-tpdtf                      1/1       Running            0          1d        10.42.216.13    onapkm2   <none>

dev-dmaap-dmaap-dr-prov-5c766b8d69-qzqn2                      1/1       Running            0          1d        10.42.115.247   onapkm6   <none>

dev-dmaap-message-router-fb9f4bc7d-5z52j                      1/1       Running            0          6h        10.42.138.31    onapkm3   <none>

dev-dmaap-message-router-kafka-5fbc897f48-4bpb6               1/1       Running            0          1d        10.42.78.141    onapkm4   <none>

dev-dmaap-message-router-zookeeper-557954854-8d6p9            1/1       Running            0          1d        10.42.169.205   onapkm1   <none>

 

but when I try to distribute a service from the SDC Portal, I get “Internal Server Error”.

 

SDC-BE log file traces:

 

2019-02-28T08:50:35.318Z        [qtp215145189-159837]   INFO    o.o.sdc.be.filters.BeServletFilter      ResponseCode=500       

InstanceUUID=null RequestId=dab0fd50-b06e-4a65-b4a8-7d7edeae3e01   AlertSeverity=0 ElapsedTime=99  EndTimestamp=2019-02-28 08:50:35.318Z PartnerName=op0001      auditOn=true       ServerFQDN=dev-sdc-sdc-be-656bd64b9b-jh57x      StatusCode=ERROR       

TargetEntity=Distribution Engine is DOWN       

CustomField1=POST: http://sdc-be.onap:8080/sdc2/rest/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate  

timer=99        CustomField2=500   AuditBeginTimestamp=2019-02-28 08:50:35.219Z    RemoteHost=10.42.194.84 ErrorCategory=ERROR    

ServerIPAddress=10.42.179.134   ServiceName=/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate  

ServiceInstanceId=null  ClassName=org.openecomp.sdc.be.filters.BeServletFilter     ResponseDescription=Internal Server Error      

ErrorCode=500   null

 

Also, the SDC healthcheck reports that the U-EB Cluster is DOWN.

 

Inside the SDC-BE pod, I tried a traceroute to “message-router-zookeeper” and to “message-router”.

 

This is the result (the first is OK, the second one is NOT OK):

 

bash-4.4# traceroute  message-router-zookeeper

traceroute to message-router-zookeeper (10.42.169.205), 30 hops max, 46 byte packets

1  10.42.7.46 (10.42.7.46)  0.213 ms  0.005 ms  0.005 ms

2  10.42.190.179 (10.42.190.179)  0.194 ms  0.145 ms  0.135 ms

3  10.42.169.205 (10.42.169.205)  0.461 ms  0.160 ms  0.134 ms

 

bash-4.4# traceroute  message-router

traceroute to message-router (10.43.1.20), 30 hops max, 46 byte packets

1  10.42.0.1 (10.42.0.1)  0.009 ms  0.005 ms  0.005 ms

2  itpat1ng505.palermo.italtel.it (138.132.168.173)  0.344 ms  2.211 ms  1.910 ms     <-- 138.132.168.x is the VM public network

3  138.132.169.2 (138.132.169.2)  5.063 ms  3.859 ms  3.934 ms

4  *  *  *

5  *  *  *

6  *  *  *

 

bash-4.4# traceroute  message-router-kafka

traceroute to message-router-kafka (10.43.148.154), 30 hops max, 46 byte packets

1  10.42.0.1 (10.42.0.1)  0.006 ms  0.005 ms  0.004 ms

2  itpat1ng505.palermo.italtel.it (138.132.168.173)  0.391 ms  0.337 ms  0.314 ms

3  138.132.169.2 (138.132.169.2)  0.803 ms  0.748 ms  0.807 ms

4  *  *  *

5  *  *  *

6  *  *  *

 

It seems that I cannot reach a NodePort or ClusterIP address from inside a pod. This is the routing table inside the pod:

 

bash-4.4# netstat -rn

Kernel IP routing table

Destination     Gateway         Genmask         Flags   MSS Window  irtt Iface

0.0.0.0         10.42.0.1       0.0.0.0         UG        0 0          0 eth0

10.42.0.0       0.0.0.0         255.255.0.0     U         0 0          0 eth0

 

What can I check on the Kubernetes cluster?

 

Thanks.

Agostino.

 

Internet Email Confidentiality Footer ** La presente comunicazione, con le informazioni in essa contenute e ogni documento o file allegato, e' rivolta unicamente alla/e persona/e cui e' indirizzata ed alle altre da questa autorizzata/e a riceverla. Se non siete i destinatari/autorizzati siete avvisati che qualsiasi azione, copia, comunicazione, divulgazione o simili basate sul contenuto di tali informazioni e' vietata e potrebbe essere contro la legge vigente (ad es. art. 616 C.P., D.Lgs n. 196/2003 Codice Privacy, Regolamento Europeo n. 679/2016/GDPR). Se avete ricevuto questa comunicazione per errore, vi preghiamo di darne immediata notizia al mittente e di distruggere il messaggio originale e ogni file allegato senza farne copia alcuna o riprodurne in alcun modo il contenuto. Al link seguente e' disponibile l'informativa Privacy: http://www.italtel.com/it/about/privacy/ ** This e-mail and its attachments are intended for the addressee(s) only and are confidential and/or may contain legally privileged information. If you have received this message by mistake or are not one of the addressees above, you may take no action based on it, and you may not copy or show it to anyone; please reply to this e-mail and point out the error which has occurred. Click here to read your privacy notice: http://www.italtel.com/it/about/privacy/



Which components are integrated with AAF

Abdelmuhaimen Seaudi
 

Hello,

 

We are studying the possible security model for an ONAP use case, and we found out that some ONAP components are integrated today with AAF, while others are not fully integrated.

 

Where can we find a reference list of all the components that are integrated with AAF in Casablanca?

 

Thanks

 

Abdelmuhaimen Seaudi

Orange Labs Egypt

Email: abdelmuhaimen.seaudi@...

Mobile: +2012 84644 733

 

_________________________________________________________________________________________________________________________

Ce message et ses pieces jointes peuvent contenir des informations confidentielles ou privilegiees et ne doivent donc
pas etre diffuses, exploites ou copies sans autorisation. Si vous avez recu ce message par erreur, veuillez le signaler
a l'expediteur et le detruire ainsi que les pieces jointes. Les messages electroniques etant susceptibles d'alteration,
Orange decline toute responsabilite si ce message a ete altere, deforme ou falsifie. Merci.

This message and its attachments may contain confidential or privileged information that may be protected by law;
they should not be distributed, used or copied without authorisation.
If you have received this email in error, please notify the sender and delete this message and its attachments.
As emails may be altered, Orange is not liable for messages that have been modified, changed or falsified.
Thank you.


Unstable jjb

Yang Bin
 

Hi,

 

Is anything wrong with the Jenkins system? Why have there been so many unstable JJB jobs since this morning (~5 hours ago)?

 

Thanks

 

Best Regards,

Bin Yang,    Solution Engineering Team,    Wind River

ONAP Multi-VIM/Cloud PTL

Direct +86 10 84777126    Mobile +86 13811391682    Fax +86 10 64398189

Skype: yangbincs993

 


Re: [ONAP] [Casablanca] procedure for certificater renewal

Ronan Keogh
 

Hi,

We came across the issue of our AAF certificates expiring for the DMaaP DataRouter component. Here is a summary of the steps we took to renew the certificates.

  1. We requested new certificate generation from AAF.
  2. Updates to the datarouter code-base:
    1. AAF certificate updates
    2. Release Note updates
  3. We then followed the steps to release the new artifact version - Release Version of Component:
    1. POM version updates (Steps 1-3).
    2. Integration project updates (Step 5)
    3. OOM project updates (Step 6)
In hindsight I wrote too many JIRA tickets to cover this procedure.
My suggestion would be either:
  • 1 ticket for 2a, 2b & 3a (Component repo updates)
  • 1 ticket for 3b (Integration repo updates)
  • 1 ticket for 3c (OOM repo updates)
or
  • 1 ticket for all with sub-tasks for each item.
Hopefully this is of help and covers everything.

Regards,
Ronan


Re: [onap-tsc] [onap-discuss] M3 template for use cases/functional requirements

Alla Goldner
 

All,

 

I updated the template per the discussion we had today: https://wiki.onap.org/pages/viewpage.action?pageId=58233064

Please provide your comments.

If no comments are received by Thursday, I will consider this template agreed, and we will check the corresponding M3 statuses next week.

 

Best Regards, Alla

 

From: onap-tsc@... <onap-tsc@...> On Behalf Of Alla Goldner
Sent: Monday, March 4, 2019 6:18 PM
To: onap-tsc@...; onap-discuss@...; onap-usecasesub@...
Subject: Re: [onap-tsc] [onap-discuss] M3 template for use cases/functional requirements

 

As discussed today: the template will be simplified and sent out tomorrow at the latest.

 

Best regards, Alla

 

Sent from Nine


From: Alex Vul <alex.vul@...>
Sent: Monday, 4 March 2019 18:15
To: onap-discuss@...; Alla Goldner; onap-usecasesub@...
Cc: onap-tsc@...
Subject: Re: [onap-tsc] [onap-discuss] M3 template for use cases/functional requirements

 

Hi,

 

Just to be clear – there is no action on this at this time…

 

From: <onap-discuss@...> on behalf of Alla Goldner <Alla.Goldner@...>
Reply-To: "onap-discuss@..." <onap-discuss@...>, "Alla.Goldner@..." <Alla.Goldner@...>
Date: Tuesday, February 26, 2019 at 9:13 AM
To: "onap-usecasesub@..." <Onap-usecasesub@...>
Cc: "onap-tsc@..." <onap-tsc@...>, onap-discuss <onap-discuss@...>
Subject: [onap-discuss] M3 template for use cases/functional requirements

 

Hi all,

 

Please find attached the draft template I've created for the M3 use cases/functional requirements review.

The motivation behind the included scope: since tests, security, etc. are reported per project, the key question per use case/functional requirement is whether its corresponding APIs were included in all the relevant projects' reviews with the Architecture Committee (ARC) and, if not, what the status of those discussions is.

 

I would like to upload it by tomorrow EOD CET, so we can get reports during next week's Usecase subcommittee meeting.

Hence, please provide your comments and suggestions.

 

Best Regards, Alla

This email and the information contained herein is proprietary and confidential and subject to the Amdocs Email Terms of Service, which you may review at https://www.amdocs.com/about/email-terms-of-service



Re: Casablanca APPC Readiness probe failed: APPC is not healthy for more than 2 hours

Vivekanandan Muthukrishnan
 

Hi Taka,

We have a 3-node k8s cluster, and each node has 32 vCPU + 64 GB RAM.

APPC was in the same state for a day, so I tried bouncing dev-appc-appc-0 & dev-appc-appc-ansible-server. Now I see that all APPC PODs are up and running.

I'm not sure whether this has something to do with network issues in accessing the ODL repository. Is this expected?

# Here are the image version details

dev-appc-appc-0 Image:                              nexus3.onap.org:10001/onap/appc-image:1.4.4

Workaround

# Today I tried bouncing dev-appc-appc-0 & dev-appc-appc-ansible-server
# Everything came up

$ kubectl get pods -n onap | grep appc
dev-appc-appc-0                                               2/2       Running            0          41m
dev-appc-appc-ansible-server-6877b497df-rfgnj                 1/1       Running            0          1m
dev-appc-appc-cdt-77bccf4847-fmtpw                            1/1       Running            0          1d
dev-appc-appc-db-0                                            1/1       Running            1          1d
dev-appc-appc-db-1                                            1/1       Running            0          1d
dev-appc-appc-db-2                                            1/1       Running            1          1d
dev-appc-appc-dgbuilder-f7565468-fnrz6                        1/1       Running            0          1d

On Mon, Mar 4, 2019 at 7:13 PM CHO, TAKAMUNE <tc012c@...> wrote:

Hi,

 

In Casablanca, those ODL features are installed when you install the APPC container. In R4 we moved ODL feature installation to when the APPC container image gets built, so it won’t take that long in R4.

 

But 1802 seconds to install is just too long. Did you check your k8s environment or VM capacity?

 

Taka

 

From: onap-discuss@... <onap-discuss@...> On Behalf Of Vivekanandan Muthukrishnan
Sent: Sunday, March 3, 2019 6:23 AM
To: onap-discuss@...
Subject: [onap-discuss] Casablanca APPC Readiness probe failed: APPC is not healthy for more than 2 hours

 

Hi All,

 

It seems like the APPC POD dev-appc-appc-0 (container appc) is taking a long time to install KARAF bundles, and dev-appc-appc-ansible-server keeps getting restarted.

 

Is this expected? Is there any workaround to load the KARAF packages from a local Maven repository?

 

# All APPC PODs

$ kubectl get pods -n onap | grep appc

dev-appc-appc-0                                               1/2       Running            0          2h

dev-appc-appc-ansible-server-6877b497df-j544r                 0/1       Init:0/1           3          37m

dev-appc-appc-cdt-77bccf4847-fmtpw                            1/1       Running            0          2h

dev-appc-appc-db-0                                            1/1       Running            1          2h

dev-appc-appc-db-1                                            1/1       Running            0          2h

dev-appc-appc-db-2                                            1/1       Running            1          2h

dev-appc-appc-dgbuilder-f7565468-fnrz6                        1/1       Running            0          2h

 

 

# It seems like ODL KARAF features are still getting installed and it is taking more time

$ kubectl logs -n onap dev-appc-appc-0 -c appc

Adding feature url mvn:org.onap.appc/onap-appc-design-services/1.4.4/xml/features

Archive:  /opt/onap/appc/features/appc-interfaces-service/appc-interfaces-service-1.4.4.zip

   creating: /opt/opendaylight/system/com/google/code/gson/gson/2.8.0/

   creating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-model/

   creating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-model/1.4.4/

   creating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-bundle/

   creating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-bundle/1.4.4/

   creating: /opt/opendaylight/system/org/onap/appc/onap-appc-interfaces-service/

   creating: /opt/opendaylight/system/org/onap/appc/onap-appc-interfaces-service/1.4.4/

  inflating: /opt/opendaylight/system/com/google/code/gson/gson/2.8.0/_remote.repositories  

  inflating: /opt/opendaylight/system/com/google/code/gson/gson/2.8.0/gson-2.8.0.jar  

  inflating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-model/1.4.4/appc-interfaces-service-model-1.4.4.jar  

  inflating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-model/1.4.4/_remote.repositories  

  inflating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-model/maven-metadata-local.xml  

  inflating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-bundle/1.4.4/appc-interfaces-service-bundle-1.4.4.jar  

  inflating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-bundle/1.4.4/_remote.repositories  

  inflating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-bundle/maven-metadata-local.xml  

  inflating: /opt/opendaylight/system/org/onap/appc/onap-appc-interfaces-service/1.4.4/_remote.repositories  

  inflating: /opt/opendaylight/system/org/onap/appc/onap-appc-interfaces-service/1.4.4/onap-appc-interfaces-service-1.4.4-features.xml  

  inflating: /opt/opendaylight/system/org/onap/appc/onap-appc-interfaces-service/maven-metadata-local.xml  

Adding feature url mvn:org.onap.appc/onap-appc-interfaces-service/1.4.4/xml/features

Installing onap-appc-core

Install of onap-appc-core took 1802 seconds

Sleep Finished

Installing onap-appc-metric

 

$ kubectl describe pod -n onap dev-appc-appc-0

Name:           dev-appc-appc-0

Namespace:      onap

Node:           casablanca03/192.168.122.233

Start Time:     Sun, 03 Mar 2019 08:39:03 +0000

Labels:         app=appc

                controller-revision-hash=dev-appc-appc-69d746947b

                release=dev-appc

Annotations:    <none>

Status:         Running

IP:             10.42.135.188

Controlled By:  StatefulSet/dev-appc-appc

Init Containers:

  appc-readiness:

    Container ID:  docker://219c491421ce1c5f83539e84508a642c0726930fe4078a72fbf0de65d17aa236

    Image:         oomk8s/readiness-check:2.0.0

    Image ID:      docker://sha256:867cb038e1d2445a6e5aedc3b5f970dacc8249ab119d6c2e088e10df886ff51f

    Port:          <none>

    Host Port:     <none>

    Command:

      /root/ready.py

    Args:

      --container-name

      appc-db

    State:          Terminated

      Reason:       Completed

      Exit Code:    0

      Started:      Sun, 03 Mar 2019 08:40:03 +0000

      Finished:     Sun, 03 Mar 2019 08:45:25 +0000

    Ready:          True

    Restart Count:  0

    Environment:

      NAMESPACE:  onap (v1:metadata.namespace)

    Mounts:

      /var/run/secrets/kubernetes.io/serviceaccount from default-token-lw9wt (ro)

Containers:

  appc:

    Container ID:  docker://f1c058a8fb540c20beaffcad0349191877527824a0b6e11d55150694c65d6427

    Ports:         8181/TCP, 1830/TCP

    Host Ports:    0/TCP, 0/TCP

    Command:

      /opt/appc/bin/startODL.sh

    State:          Running

      Started:      Sun, 03 Mar 2019 08:48:29 +0000

    Ready:          False

    Restart Count:  0

    Readiness:      exec [/opt/appc/bin/health_check.sh] delay=10s timeout=1s period=10s #success=1 #failure=3

    Environment:

      MYSQL_ROOT_PASSWORD:  <set to the key 'db-root-password' in secret 'dev-appc-appc'>  Optional: false

      SDNC_CONFIG_DIR:      /opt/onap/appc/data/properties

      APPC_CONFIG_DIR:      /opt/onap/appc/data/properties

      DMAAP_TOPIC_ENV:      SUCCESS

      ENABLE_AAF:           true

      ENABLE_ODL_CLUSTER:   false

      APPC_REPLICAS:        1

    Mounts:

      /etc/localtime from localtime (ro)

      /opt/onap/appc/bin/health_check.sh from onap-appc-bin (rw)

      /opt/onap/appc/bin/installAppcDb.sh from onap-appc-bin (rw)

      /opt/onap/appc/bin/startODL.sh from onap-appc-bin (rw)

      /opt/onap/appc/data/properties/aaa-app-config.xml from onap-appc-data-properties (rw)

      /opt/onap/appc/data/properties/aaiclient.properties from onap-appc-data-properties (rw)

      /opt/onap/appc/data/properties/appc.properties from onap-appc-data-properties (rw)

      /opt/onap/appc/data/properties/cadi.properties from onap-appc-data-properties (rw)

      /opt/onap/appc/data/properties/dblib.properties from onap-appc-data-properties (rw)

      /opt/onap/appc/data/properties/svclogic.properties from onap-appc-data-properties (rw)

      /opt/onap/appc/svclogic/bin/showActiveGraphs.sh from onap-appc-svclogic-bin (rw)

      /opt/onap/appc/svclogic/config/svclogic.properties from onap-appc-svclogic-config (rw)

      /opt/onap/ccsdk/bin/installSdncDb.sh from onap-sdnc-bin (rw)

      /opt/onap/ccsdk/bin/startODL.sh from onap-sdnc-bin (rw)

      /opt/onap/ccsdk/data/properties/aaiclient.properties from onap-sdnc-data-properties (rw)

      /opt/onap/ccsdk/data/properties/dblib.properties from onap-sdnc-data-properties (rw)

      /opt/onap/ccsdk/data/properties/svclogic.properties from onap-sdnc-data-properties (rw)

      /opt/onap/ccsdk/svclogic/bin/showActiveGraphs.sh from onap-sdnc-svclogic-bin (rw)

      /opt/onap/ccsdk/svclogic/config/svclogic.properties from onap-sdnc-svclogic-config (rw)

      /opt/opendaylight/current/daexim from dev-appc-appc-data (rw)

      /opt/opendaylight/current/etc/org.ops4j.pax.logging.cfg from log-config (rw)

      /var/log/onap from logs (rw)

      /var/run/secrets/kubernetes.io/serviceaccount from default-token-lw9wt (ro)

  filebeat-onap:

    Container ID:   docker://daf7ddc4a3e4945a1a4cab940906022248696e72c90bb15fa01144cacd3a1833

    Image:          docker.elastic.co/beats/filebeat:5.5.0

    Image ID:       docker://sha256:b61327632415b6d374b9f34cea71cb14f9c352e5259140ce6e3c8eaf8becaa1b

    Port:           <none>

    Host Port:      <none>

    State:          Running

      Started:      Sun, 03 Mar 2019 08:48:30 +0000

    Ready:          True

    Restart Count:  0

    Environment:    <none>

    Mounts:

      /usr/share/filebeat/data from data-filebeat (rw)

      /usr/share/filebeat/filebeat.yml from filebeat-conf (rw)

      /var/log/onap from logs (rw)

      /var/run/secrets/kubernetes.io/serviceaccount from default-token-lw9wt (ro)

Conditions:

  Type              Status

  Initialized       True 

  Ready             False 

  ContainersReady   False 

  PodScheduled      True 

Volumes:

  dev-appc-appc-data:

    Type:       PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)

    ClaimName:  dev-appc-appc-data-dev-appc-appc-0

    ReadOnly:   false

  localtime:

    Type:          HostPath (bare host directory volume)

    Path:          /etc/localtime

    HostPathType:  

  filebeat-conf:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-filebeat

    Optional:  false

  log-config:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-logging-cfg

    Optional:  false

  logs:

    Type:    EmptyDir (a temporary directory that shares a pod's lifetime)

    Medium:  

  data-filebeat:

    Type:    EmptyDir (a temporary directory that shares a pod's lifetime)

    Medium:  

  onap-appc-data-properties:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-appc-data-properties

    Optional:  false

  onap-appc-svclogic-config:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-appc-svclogic-config

    Optional:  false

  onap-appc-svclogic-bin:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-appc-svclogic-bin

    Optional:  false

  onap-appc-bin:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-appc-bin

    Optional:  false

  onap-sdnc-data-properties:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-sdnc-data-properties

    Optional:  false

  onap-sdnc-svclogic-config:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-sdnc-svclogic-config

    Optional:  false

  onap-sdnc-svclogic-bin:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-sdnc-svclogic-bin

    Optional:  false

  onap-sdnc-bin:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-sdnc-bin

    Optional:  false

  default-token-lw9wt:

    Type:        Secret (a volume populated by a Secret)

    SecretName:  default-token-lw9wt

    Optional:    false

QoS Class:       BestEffort

Node-Selectors:  <none>

Tolerations:     node.kubernetes.io/not-ready:NoExecute for 300s

                 node.kubernetes.io/unreachable:NoExecute for 300s

Events:

  Type     Reason     Age               From                   Message

  ----     ------     ----              ----                   -------

  Warning  Unhealthy  26m (x9 over 1h)  kubelet, casablanca03  Readiness probe failed: APPC is not healthy.

++ ps -e

++ grep startODL

++ wc -l

+ startODL_status=1

++ grep Waiting

++ wc -l

++ /opt/opendaylight/current/bin/client bundle:list

+ waiting_bundles=0

++ /opt/opendaylight/current/bin/client system:start-level

+ run_level='Level 100'

+ '[' 'Level 100' == 'Level 100' ']'

+ '[' 1 -lt 1 ']'

+ echo APPC is not healthy.

+ exit 1

  Warning  Unhealthy  20m (x212 over 1h)  kubelet, casablanca03  (combined from similar events): Readiness probe failed: APPC is not healthy.

++ ps -e

++ grep startODL

++ wc -l

+ startODL_status=1

++ grep Waiting

++ /opt/opendaylight/current/bin/client bundle:list

++ wc -l

+ waiting_bundles=0

++ /opt/opendaylight/current/bin/client system:start-level

+ run_level='Level 100'

+ '[' 'Level 100' == 'Level 100' ']'

+ '[' 1 -lt 1 ']'

+ echo APPC is not healthy.

+ exit 1

  Warning  Unhealthy  16m  kubelet, casablanca03  Readiness probe failed: APPC is not healthy.

++ wc -l

++ ps -e

++ grep startODL

+ startODL_status=1

++ /opt/opendaylight/current/bin/client bundle:list

++ grep Waiting

++ wc -l

+ waiting_bundles=0

++ /opt/opendaylight/current/bin/client system:start-level

+ run_level='Level 100'

+ '[' 'Level 100' == 'Level 100' ']'

+ '[' 1 -lt 1 ']'

+ echo APPC is not healthy.

+ exit 1

  Warning  Unhealthy  11m (x203 over 1h)  kubelet, casablanca03  Readiness probe failed: APPC is not healthy.

++ ps -e

++ grep startODL

++ wc -l

+ startODL_status=1

++ /opt/opendaylight/current/bin/client bundle:list

++ grep Waiting

++ wc -l

+ waiting_bundles=0

++ /opt/opendaylight/current/bin/client system:start-level

+ run_level='Level 100'

+ '[' 'Level 100' == 'Level 100' ']'

+ '[' 1 -lt 1 ']'

+ echo APPC is not healthy.

+ exit 1

  Warning  Unhealthy  5m (x12 over 1h)  kubelet, casablanca03  Readiness probe failed: APPC is not healthy.

++ ps -e

++ wc -l

++ grep startODL

+ startODL_status=1

++ /opt/opendaylight/current/bin/client bundle:list

++ wc -l

++ grep Waiting

+ waiting_bundles=0

++ /opt/opendaylight/current/bin/client system:start-level

+ run_level='Level 100'

+ '[' 'Level 100' == 'Level 100' ']'

+ '[' 1 -lt 1 ']'

+ echo APPC is not healthy.

+ exit 1

  Warning  Unhealthy  1m (x31 over 1h)  kubelet, casablanca03  Readiness probe failed: APPC is not healthy.

++ ps -e

++ wc -l

++ grep startODL

+ startODL_status=1

++ /opt/opendaylight/current/bin/client bundle:list

++ grep Waiting

++ wc -l

+ waiting_bundles=0

++ /opt/opendaylight/current/bin/client system:start-level

+ run_level='Level 100'

+ '[' 'Level 100' == 'Level 100' ']'

+ '[' 1 -lt 1 ']'

+ echo APPC is not healthy.

+ exit 1
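The probe trace above boils down to three conditions: the Karaf run level must be "Level 100", no bundles may be in the Waiting state, and the startODL.sh process must have exited. As a reading aid only, here is a sketch of the decision logic the trace implies (not the actual health_check.sh):

```python
def appc_healthy(startodl_procs: int, waiting_bundles: int, run_level: str) -> bool:
    """Mirror the health-check logic visible in the probe trace:
    healthy only when Karaf is at run level 100, no bundles are
    Waiting, and startODL.sh is no longer running."""
    if run_level != "Level 100":
        return False  # Karaf has not reached its final start level yet
    if waiting_bundles > 0:
        return False  # some bundles are still resolving
    if startodl_procs >= 1:
        return False  # startODL.sh is still installing features
    return True
```

In the failing trace, startODL_status=1 with waiting_bundles=0 at Level 100, so the probe reports unhealthy purely because startODL.sh had not yet finished installing features.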

 

 

=== We can access the karaf

 

$ kubectl exec -it -n onap dev-appc-appc-0 -c appc -- /bin/bash

root@dev-appc-appc-0:/# ls

bin   dev  home  lib64  mnt  proc  run   srv  tmp  var

boot  etc  lib   media  opt  root  sbin  sys  usr

root@dev-appc-appc-0:/# ps -ef | grep java

root       217   156 15 08:48 ?        00:21:44 /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Djava.security.properties=/opt/opendaylight/etc/odl.java.security -Xms128M -Xmx2048m -XX:+UnlockDiagnosticVMOptions -XX:+HeapDumpOnOutOfMemoryError -Dcom.sun.management.jmxremote -Djava.security.egd=file:/dev/./urandom -Djava.endorsed.dirs=/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/endorsed:/usr/lib/jvm/java-8-openjdk-amd64/lib/endorsed:/opt/opendaylight/lib/endorsed -Djava.ext.dirs=/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/ext:/usr/lib/jvm/java-8-openjdk-amd64/lib/ext:/opt/opendaylight/lib/ext -Dkaraf.instances=/opt/opendaylight/instances -Dkaraf.home=/opt/opendaylight -Dkaraf.base=/opt/opendaylight -Dkaraf.data=/opt/opendaylight/data -Dkaraf.etc=/opt/opendaylight/etc -Dkaraf.restart.jvm.supported=true -Djava.io.tmpdir=/opt/opendaylight/data/tmp -Djava.util.logging.config.file=/opt/opendaylight/etc/java.util.logging.properties -Dkaraf.startLocalConsole=false -Dkaraf.startRemoteShell=true -classpath /opt/opendaylight/lib/boot/org.apache.karaf.diagnostic.boot-4.1.5.jar:/opt/opendaylight/lib/boot/org.apache.karaf.jaas.boot-4.1.5.jar:/opt/opendaylight/lib/boot/org.apache.karaf.main-4.1.5.jar:/opt/opendaylight/lib/boot/org.osgi.core-6.0.0.jar org.apache.karaf.main.Main

root     10908  1660  0 10:53 ?        00:00:04 /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Dkaraf.instances=/opt/opendaylight/instances -Dkaraf.home=/opt/opendaylight -Dkaraf.base=/opt/opendaylight -Dkaraf.etc=/opt/opendaylight/etc -Djava.io.tmpdir=/opt/opendaylight/data/tmp -Djava.util.logging.config.file=/opt/opendaylight/etc/java.util.logging.properties -classpath /opt/opendaylight/system/org/apache/karaf/org.apache.karaf.client/4.1.5/org.apache.karaf.client-4.1.5.jar:/opt/opendaylight/system/org/apache/sshd/sshd-core/1.6.0/sshd-core-1.6.0.jar:/opt/opendaylight/system/org/fusesource/jansi/jansi/1.17/jansi-1.17.jar:/opt/opendaylight/system/org/jline/jline/3.6.0/jline-3.6.0.jar:/opt/opendaylight/system/org/slf4j/slf4j-api/1.7.12/slf4j-api-1.7.12.jar org.apache.karaf.client.Main feature:install -r onap-appc-metric

root     25586 25509  0 11:10 ?        00:00:00 /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Dkaraf.instances=/opt/opendaylight/current/instances -Dkaraf.home=/opt/opendaylight/current -Dkaraf.base=/opt/opendaylight/current -Dkaraf.etc=/opt/opendaylight/current/etc -Djava.io.tmpdir=/opt/opendaylight/current/data/tmp -Djava.util.logging.config.file=/opt/opendaylight/current/etc/java.util.logging.properties -classpath /opt/opendaylight/current/system/org/apache/karaf/org.apache.karaf.client/4.1.5/org.apache.karaf.client-4.1.5.jar:/opt/opendaylight/current/system/org/apache/sshd/sshd-core/1.6.0/sshd-core-1.6.0.jar:/opt/opendaylight/current/system/org/fusesource/jansi/jansi/1.17/jansi-1.17.jar:/opt/opendaylight/current/system/org/jline/jline/3.6.0/jline-3.6.0.jar:/opt/opendaylight/current/system/org/slf4j/slf4j-api/1.7.12/slf4j-api-1.7.12.jar org.apache.karaf.client.Main system:start-level

root     25637 25428  0 11:10 ?        00:00:00 grep --color=auto java

[ONAP] [Casablanca] procedure for certificater renewal

Morgan Richomme
 

Hi,

some end users of the Orange openlab (Casablanca) reported the recent expiration of some certificates (VID) on the current Casablanca solution.

The certificate expired on 28 February 2019 at 12:22. The current time is 4 March 2019 at 16:54.

I had a look at the ONAP documentation (https://docs.onap.org/en/casablanca/) and on the wiki to see if any procedure is documented for the team in charge of operating the platform to extend or renew certificates (an expiration for Casablanca at the end of February sounds a bit strange - we reinstalled the ~ maintenance release in mid-February, using the casablanca branch).

I may have missed something in the documentation, but I did not find a chapter addressing this topic.
The documented operations all relate to OOM (quick start, user guide).
I did not see anything about certificate update/renewal.

I saw the AAF patchset https://gerrit.onap.org/r/#/c/79211/ related to dmaap-datarouter (not merged).
Shall we upgrade the version of the expired pods components?
Is there any list to plan the operations?
Any information will be welcome.

/Morgan
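One practical way to audit expiry dates before they bite is an openssl check. The sketch below generates a throwaway local certificate only to demonstrate the command; against a live component you would point openssl s_client at the service's host and port (the host/port placeholders below are assumptions, not taken from this thread):

```shell
# Generate a short-lived self-signed cert as a local stand-in for a pod cert
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout /tmp/demo-key.pem -out /tmp/demo-cert.pem \
  -subj "/CN=vid.onap" 2>/dev/null

# Print when a certificate file expires
openssl x509 -noout -enddate -in /tmp/demo-cert.pem

# Against a running endpoint (host/port are placeholders):
#   echo | openssl s_client -connect <vid-host>:<port> 2>/dev/null \
#     | openssl x509 -noout -enddate
```

Running the expiry check against each exposed HTTPS NodePort would give the operations team a simple list of upcoming expirations.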






[policy][clamp] Joint project meeting 3/6/2019 - UPDATED BRIDGE

Pamela Dragosh
 

FYI the combined policy/clamp weekly project meeting this Wednesday has an updated bridge since Martial is OOO:

 

 https://zoom.us/j/455740750

 

https://wiki.onap.org/pages/viewpage.action?pageId=59966027

 

thanks!

 

Pam Dragosh

Policy PTL


Re: [onap-tsc] [onap-discuss] M3 template for use cases/functional requirements

Alla Goldner
 

As discussed today: the template will be simplified and sent out tomorrow at the latest.


Best regards, Alla


Sent from Nine


From: Alex Vul <alex.vul@...>
Sent: Monday, 4 March 2019 18:15
To: onap-discuss@...; Alla Goldner; onap-usecasesub@...
Cc: onap-tsc@...
Subject: Re: [onap-tsc] [onap-discuss] M3 template for use cases/functional requirements

Hi,

 

Just to be clear – there is no action on this at this time…

 

From: <onap-discuss@...> on behalf of Alla Goldner <Alla.Goldner@...>
Reply-To: "onap-discuss@..." <onap-discuss@...>, "Alla.Goldner@..." <Alla.Goldner@...>
Date: Tuesday, February 26, 2019 at 9:13 AM
To: "onap-usecasesub@..." <Onap-usecasesub@...>
Cc: "onap-tsc@..." <onap-tsc@...>, onap-discuss <onap-discuss@...>
Subject: [onap-discuss] M3 template for use cases/functional requirements

 

Hi all,

 

Please find attached the template’s draft I’ve created for M3 use cases/functional requirements review.

The motivation behind the included scope: since tests, security, etc. will be reported per project, the key question per use case/functional requirement is whether its corresponding APIs were included in all relevant project reviews with the Architecture Committee (ARC) and, if not, what the status of those discussions is.

 

I would like to upload it tomorrow EOD CET, so we can get reports during next week's Usecase subcommittee meeting.

Hence, please provide your comments and suggestions.

 

Best Regards, Alla

This email and the information contained herein is proprietary and confidential and subject to the Amdocs Email Terms of Service, which you may review at https://www.amdocs.com/about/email-terms-of-service



Re: M3 template for use cases/functional requirements

Alex Vul <alex.vul@...>
 

Hi,

 

Just to be clear – there is no action on this at this time…

 

From: <onap-discuss@...> on behalf of Alla Goldner <Alla.Goldner@...>
Reply-To: "onap-discuss@..." <onap-discuss@...>, "Alla.Goldner@..." <Alla.Goldner@...>
Date: Tuesday, February 26, 2019 at 9:13 AM
To: "onap-usecasesub@..." <Onap-usecasesub@...>
Cc: "onap-tsc@..." <onap-tsc@...>, onap-discuss <onap-discuss@...>
Subject: [onap-discuss] M3 template for use cases/functional requirements

 

Hi all,

 

Please find attached the template’s draft I’ve created for M3 use cases/functional requirements review.

The motivation behind the included scope: since tests, security, etc. will be reported per project, the key question per use case/functional requirement is whether its corresponding APIs were included in all relevant project reviews with the Architecture Committee (ARC) and, if not, what the status of those discussions is.

 

I would like to upload it tomorrow EOD CET, so we can get reports during next week's Usecase subcommittee meeting.

Hence, please provide your comments and suggestions.

 

Best Regards, Alla

This email and the information contained herein is proprietary and confidential and subject to the Amdocs Email Terms of Service, which you may review at https://www.amdocs.com/about/email-terms-of-service


Re: dmaap-message-router NodePort not recheable

Brian Freeman
 

Agos.,

 

Make sure you are using 3.0.1-ONAP (Maintenance release)

 

That being said, the SDC team thinks it's a timing issue: DMaaP has to be up (and dockerdata-nfs cleaned out) before SDC starts.

 

Brian

 

 

 

From: FREEMAN, BRIAN D
Sent: Monday, March 04, 2019 9:31 AM
To: Calamita Agostino <agostino.calamita@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not recheable

 

Hmm

 

Those two SDC-BE messages are for clients trying to register so I think that is expected until SDC registers with DMaaP/MR.

Is there an SDC-BE error on its communication with DMaaP/MR?

 

The other thing to try is a redeploy of dev-dmaap (remove dockerdata-nfs/dev-dmaap to clean out the topic registrations)

And dev-sdc (again just to start them fresh)
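The redeploy sequence described above can be sketched as a small helper that prints the commands for review before you run them (the release names, override file path, and /dockerdata-nfs location are assumptions drawn from this thread; adjust to your deployment):

```shell
#!/bin/sh
# Dry-run helper: prints the redeploy steps for a Helm release rather than
# executing them, so they can be reviewed against your environment first.
print_redeploy() {
  release="$1"
  echo "helm delete ${release} --purge"
  echo "rm -rf /dockerdata-nfs/${release}   # clears persisted state (e.g. topic registrations)"
  echo "helm deploy ${release} local/onap -f /root/integration-override.yaml --namespace onap"
}

# Redeploy DMaaP first, then SDC once DMaaP passes its health check
print_redeploy dev-dmaap
print_redeploy dev-sdc
```

Order matters here: clearing dev-dmaap's persisted data forces clients such as SDC to re-register their topics, so SDC should come up only after Message Router is healthy.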

 

Brian

 

 

 

From: Calamita Agostino <agostino.calamita@...>
Sent: Monday, March 04, 2019 5:21 AM
To: FREEMAN, BRIAN D <bf1936@...>; onap-discuss@...
Subject: R: [onap-discuss] dmaap-message-router NodePort not recheable

 

Hi, after redeploying SDC, the healthcheck returns "DMaaP: None" in the SDC health check, and distribution of a service returns a POL5000 error.

 

In SDC-FE error log files I see:

 

2019-03-04T09:52:38.847Z        [qtp215145189-44]       INFO    o.o.sdc.fe.servlets.FeProxyServlet      timer=12        ErrorCategory=INFO      RequestId=null ServiceName=SDC catalog serviceInstanceID=null  ErrorCode=0     uuid=599a9bb3-d3c8-4cea-a926-ea6a11762a63       userId=op0001   localAddr=10.42.236.172        remoteAddr=10.42.98.28  SC="500"

 

 

And in SDC-BE error log file I see these messages:

 

2019-03-04T10:15:21.209Z        [qtp215145189-16]       INFO    o.o.sdc.be.filters.BeServletFilter      AuditMessage=ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "clamp" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"       AlertSeverity=0 ElapsedTime=96  EndTimestamp=2019-03-04 10:15:21.208Z  auditOn=false   ServerFQDN=dev-sdc-sdc-be-656bd64b9b-5b89b      StatusCode=ERROR        timer=96        ServiceInstanceId=null  ClassName=org.openecomp.sdc.be.filters.BeServletFilter ResponseDescription=Internal Server Error       ResponseCode=500        InstanceUUID=clamp      RequestId=8c56c7a2-da62-4598-963d-b5ed13673fce PartnerName=Apache-HttpClient/4.5.6 (Java/1.8.0_181)    TargetEntity=registerInDistributionEngine       CustomField1=POST: https://sdc-be.onap:8443/sdc/v1/registerForDistribution     CustomField2=500        AuditBeginTimestamp=2019-03-04 10:15:21.112Z    RemoteHost=10.42.216.84        ErrorCategory=INFO      ServerIPAddress=10.42.13.89     ServiceName=/v1/registerForDistribution ErrorCode=0     POST /sdc/v1/registerForDistribution HTTP/1.1 SC="500"

 

2019-03-04T10:15:24.740Z        [qtp215145189-20]       ERROR   o.o.s.c.config.EcompErrorLogUtil        alarmSeverity=MAJOR     AuditBeginTimestamp=2019-03-04 10:15:24.690Z   AuditMessage=ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "policy" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"      RequestId=ab5939bb-6780-42e7-b63a-54381b74c352  ErrorCategory=ERROR     ServerIPAddress=10.42.13.89     ServiceName=/v1/registerForDistribution        ErrorCode=500   PartnerName=Apache-HttpClient/4.5.5 (Java/1.8.0_171)    auditOn=true    ServerFQDN=dev-sdc-sdc-be-656bd64b9b-5b89b    TargetEntity=registerInDistributionEngine        Error occured in Distribution Engine. Failed operation: registration validation failed

 

 

Any other check to do ?

 

Thank.

Agos.

 

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 01, 2019 3:39 PM
To: Calamita Agostino <agostino.calamita@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not recheable

 

I'd do a helm delete dev-sdc --purge

Delete /dockerdata-nfs/dev-so

Confirm pv/pvc/pod are gone

Then

 

helm deploy dev-sdc local/onap -f /root/oom/kubernetes/onap/resources/environments/public-cloud.yaml -f /root/integration-override.yaml --namespace onap  --verbose

 

(or whatever your override files are)

 

Looks like SDC came up before dmaap and is confused.

 

There are some less intrusive things to try but you need SDC to Pass Health Check (with DMaaP Up from its perspective)

Basic SDC Health Check                                                (DMaaP:UP)| PASS |

 

 

Brian

 

 

From: Calamita Agostino <agostino.calamita@...>
Sent: Friday, March 01, 2019 9:30 AM
To: FREEMAN, BRIAN D <bf1936@...>; onap-discuss@...
Subject: R: [onap-discuss] dmaap-message-router NodePort not recheable

 

It works.

 

curl -X POST http://138.132.168.85:30227/events/TEST_TOPIC -H 'cache-control: no-cache'   -H 'content-type: application/json'  -H 'postman-token: 1c679102-85e8-f1a2-e708-3e6d84f8ea06' -d '{ "test": "success",                "timestamp": "1/1/2020" }'

{

    "serverTimeMs": 1,

    "count": 1

}

 

curl -X GET 'http://138.132.168.85:30227/events/TEST_TOPIC/g1/c3?timeout=5000' -H 'accept: application/json'  -H 'cache-control: no-cache'  -H 'postman-token: 04778117-fd44-0cac-b70c-ef2a2c3024af'                       

["{\"test\":\"success\",\"timestamp\":\"1/1/2020\"}"]

 

Agos.

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 01, 2019 3:21 PM
To: Calamita Agostino <agostino.calamita@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not recheable

 

Casablanca.

 

OK

 

Use curl or POSTMAN to write to a TEST_TOPIC (unauthenticated topics are created on demand)

(replace 10.12.5.13 with one of your k8s host IPs) - you don't need the postman-token; modify for your environment and preferences, etc.

 

curl -X POST \

  http://10.12.5.13:30227/events/TEST_TOPIC \

  -H 'cache-control: no-cache' \

  -H 'content-type: application/json' \

  -H 'postman-token: 1c679102-85e8-f1a2-e708-3e6d84f8ea06' \

  -d '{ "test": "success",

               "timestamp": "1/1/2020"

}'

 

Then do a GET

 

curl -X GET \

  'http://10.12.5.13:30227/events/TEST_TOPIC/g1/c3?timeout=5000' \

  -H 'accept: application/json' \

  -H 'cache-control: no-cache' \

  -H 'postman-token: 04778117-fd44-0cac-b70c-ef2a2c3024af'

 

 

You should get the test/timestamp object back on the GET (you have to execute the POST/GET pair twice on the initial topic create)

 

This is to confirm that Message Router is internally talking to itself correctly.

 

 

Brian

 

From: Calamita Agostino <agostino.calamita@...>
Sent: Friday, March 01, 2019 9:13 AM
To: FREEMAN, BRIAN D <bf1936@...>; onap-discuss@...
Subject: R: [onap-discuss] dmaap-message-router NodePort not recheable

 

This is the output:

 

Executing robot tests at log level TRACE

[ ERROR ] Suite 'Testsuites' contains no tests with tag 'healthmr'.

 

Try --help for usage information.

command terminated with exit code 252

 

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 01, 2019 3:12 PM
To: Calamita Agostino <agostino.calamita@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not recheable

 

Please try ./ete-k8s.sh onap healthmr

 

From: Calamita Agostino <agostino.calamita@...>
Sent: Friday, March 01, 2019 9:09 AM
To: FREEMAN, BRIAN D <bf1936@...>; onap-discuss@...
Subject: R: [onap-discuss] dmaap-message-router NodePort not recheable

 

I didn't find the healthmr test, only health.

(

./ete-k8s.sh onap

Usage: ete-k8s.sh [namespace] [ health | healthdist | distribute | instantiate | instantiateVFWCL | instantiateDemoVFWCL |  | portal ] )

 

The command ./ete-k8s.sh onap health reports the list below (  Basic DMAAP Message Router Health Check = PASS )                            

In my environment there are some PODs not in Running state:

 

dev-aai-aai-data-router-5d55646cdc-cc62v                      1/2       CrashLoopBackOff   1084       4d        10.42.79.203    onapkm3   <none>

dev-appc-appc-ansible-server-76fcf9454d-8km9d                 0/1       CrashLoopBackOff   1656       6d        10.42.212.202   onapkm0   <none>

dev-oof-oof-has-api-585497f5-ktjsv                            0/1       Init:0/3           1085       8d        10.42.86.82     onapkm0   <none>

dev-oof-oof-has-controller-9469b9ff8-td4k9                    0/1       Init:1/3           945        8d        10.42.5.110     onapkm2   <none>

dev-oof-oof-has-data-d559897dc-4lmkt                          0/1       Init:1/4           1091       8d        10.42.199.220   onapkm3   <none>

dev-oof-oof-has-healthcheck-jq9xq                             0/1       Init:0/1           1092       8d        10.42.242.145   onapkm3   <none>

dev-oof-oof-has-reservation-868c7c88ff-pv79n                  0/1       Init:1/4           1081       8d        10.42.176.61    onapkm1   <none>

dev-oof-oof-has-solver-6f8bc6fdf4-tw4cj                       0/1       Init:1/4           1084       8d        10.42.29.154    onapkm0   <none>

dev-sdnc-sdnc-ansible-server-7c76f965c6-hqtzl                 0/1       CrashLoopBackOff   1844       8d        10.42.202.36    onapkm3   <none>

dev-sdnc-sdnc-ueb-listener-6d74459c6-tdqhc                    0/1       CrashLoopBackOff   542        1d        10.42.219.51    onapkm2   <none>

 

and multicloud is not deployed.

 

==============================================================================

Testsuites

==============================================================================

Testsuites.Health-Check :: Testing ecomp components are available via calls.

==============================================================================

Basic A&AI Health Check                                               | PASS |

------------------------------------------------------------------------------

Basic AAF Health Check                                                | PASS |

------------------------------------------------------------------------------

Basic AAF SMS Health Check                                            | PASS |

------------------------------------------------------------------------------

Basic APPC Health Check                                               | PASS |

------------------------------------------------------------------------------

Basic CLI Health Check                                                | PASS |

------------------------------------------------------------------------------

Basic CLAMP Health Check                                              | PASS |

------------------------------------------------------------------------------

Basic DCAE Health Check                                               [ WARN ] Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e8102250>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /healthcheck

[ WARN ] Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e817a8d0>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /healthcheck

[ WARN ] Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e817a850>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /healthcheck

| FAIL |

ConnectionError: HTTPConnectionPool(host='dcae-healthcheck.onap', port=80): Max retries exceeded with url: /healthcheck (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e817a5d0>: Failed to establish a new connection: [Errno -2] Name or service not known',))

------------------------------------------------------------------------------

Basic DMAAP Data Router Health Check                                  | PASS |

------------------------------------------------------------------------------

Basic DMAAP Message Router Health Check                               | PASS |

------------------------------------------------------------------------------

Basic External API NBI Health Check                                   [ WARN ] Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e80d6c50>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /nbi/api/v3/status

[ WARN ] Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e80c4e10>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /nbi/api/v3/status

[ WARN ] Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e8199f50>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /nbi/api/v3/status

| FAIL |

ConnectionError: HTTPConnectionPool(host='nbi.onap', port=8080): Max retries exceeded with url: /nbi/api/v3/status (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e80e5410>: Failed to establish a new connection: [Errno -2] Name or service not known',))

------------------------------------------------------------------------------

Basic Log Elasticsearch Health Check                                  | PASS |

------------------------------------------------------------------------------

Basic Log Kibana Health Check                                         | PASS |

------------------------------------------------------------------------------

Basic Log Logstash Health Check                                       | PASS |

------------------------------------------------------------------------------

Basic Microservice Bus Health Check                                   | PASS |

------------------------------------------------------------------------------

Basic Multicloud API Health Check                                     | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-ocata API Health Check                               | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-pike API Health Check                                | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-titanium_cloud API Health Check                      | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-vio API Health Check                                 | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic OOF-Homing Health Check                                         | FAIL |

Test timeout 10 seconds exceeded.

------------------------------------------------------------------------------

Basic OOF-SNIRO Health Check                                          | PASS |

------------------------------------------------------------------------------

Basic OOF-CMSO Health Check                                           | PASS |

------------------------------------------------------------------------------

Basic Policy Health Check                                             | PASS |

------------------------------------------------------------------------------

Basic Pomba AAI-context-builder Health Check                          | PASS |

------------------------------------------------------------------------------

Basic Pomba SDC-context-builder Health Check                          | PASS |

------------------------------------------------------------------------------

Basic Pomba Network-discovery-context-builder Health Check            | PASS |

------------------------------------------------------------------------------

Basic Portal Health Check                                             | PASS |

------------------------------------------------------------------------------

Basic SDC Health Check                                                (DMaaP:None)| PASS |

------------------------------------------------------------------------------

Basic SDNC Health Check                                               | PASS |

------------------------------------------------------------------------------

Basic SO Health Check                                                 | PASS |

------------------------------------------------------------------------------

Basic UseCaseUI API Health Check                                      | PASS |

------------------------------------------------------------------------------

Basic VFC catalog API Health Check                                    | PASS |

------------------------------------------------------------------------------

Basic VFC emsdriver API Health Check                                  | PASS |

------------------------------------------------------------------------------

Basic VFC gvnfmdriver API Health Check                                | PASS |

------------------------------------------------------------------------------

Basic VFC huaweivnfmdriver API Health Check                           | PASS |

------------------------------------------------------------------------------

Basic VFC jujuvnfmdriver API Health Check                             | PASS |

------------------------------------------------------------------------------

Basic VFC multivimproxy API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VFC nokiavnfmdriver API Health Check                            | PASS |

------------------------------------------------------------------------------

Basic VFC nokiav2driver API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VFC nslcm API Health Check                                      | PASS |

------------------------------------------------------------------------------

Basic VFC resmgr API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC vnflcm API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC vnfmgr API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC vnfres API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC workflow API Health Check                                   | PASS |

------------------------------------------------------------------------------

Basic VFC ztesdncdriver API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VFC ztevnfmdriver API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VID Health Check                                                | PASS |

------------------------------------------------------------------------------

Basic VNFSDK Health Check                                             | PASS |

------------------------------------------------------------------------------

Basic Holmes Rule Management API Health Check                         | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Holmes Engine Management API Health Check                       | FAIL |

502 != 200

------------------------------------------------------------------------------

Testsuites.Health-Check :: Testing ecomp components are available ... | FAIL |

51 critical tests, 41 passed, 10 failed

51 tests total, 41 passed, 10 failed

==============================================================================

Testsuites                                                            | FAIL |

51 critical tests, 41 passed, 10 failed

51 tests total, 41 passed, 10 failed

==============================================================================

Output:  /share/logs/0001_ete_health/output.xml

Log:     /share/logs/0001_ete_health/log.html

Report:  /share/logs/0001_ete_health/report.html

command terminated with exit code 10

 

 

 

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 01, 2019 2:49 PM
To: onap-discuss@...; Calamita Agostino <agostino.calamita@...>
Subject: RE: [onap-discuss] dmaap-message-router NodePort not recheable

 

Try a POST to make sure you can write to message router.

I doubt it's connectivity.

 

If you are on the master branch, try ./ete-k8s.sh onap healthmr to test a write/read to a test topic.

 

(do it twice, since the first time it creates the test topic and Kafka doesn't forward the message until both the publisher and the subscriber have connected)

 

Brian

 

 

From: onap-discuss@... <onap-discuss@...> On Behalf Of Calamita Agostino
Sent: Friday, March 01, 2019 4:30 AM
To: onap-discuss@...
Subject: R: [onap-discuss] dmaap-message-router NodePort not recheable

 

I tried to execute a wget command from the sdc-be POD to the message-router REST API, and I see that dmaap-message-router is reachable from sdc-be.

 

This is the result:

 

# kubectl exec -it  dev-sdc-sdc-be-656bd64b9b-jh57x  -n onap -- /bin/bash

 

bash-4.4# wget "http://message-router:3904/topics"

Connecting to message-router:3904 (10.43.1.20:3904)

topics               100% |*******************************|   131   0:00:00 ETA

bash-4.4# cat topics

{"topics": [

    "__consumer_offsets",

    "champRawEvents",

    "SDC-DISTR-NOTIF-TOPIC-AUTO",

    "org.onap.dmaap.mr.PNF_READY"

]}bash-4.4#

 

But the audit.log of sdc-be, after the "Distribution Service" action from the Portal, says:

 

2019-03-01T08:32:07.986Z        [qtp215145189-323354]   INFO    o.o.sdc.be.filters.BeServletFilter     

ResponseCode=500        InstanceUUID=null       RequestId=d2f65e19-b07b-4266-8be2-f170aba42fb1  AlertSeverity=0 ElapsedTime=3  

EndTimestamp=2019-03-01 08:32:07.986Z   PartnerName=op0001      auditOn=true    ServerFQDN=dev-sdc-sdc-be-656bd64b9b-jh57x      

StatusCode=ERROR        TargetEntity=Distribution Engine is DOWN       

CustomField1=POST: http://sdc-be.onap:8080/sdc2/rest/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate       

timer=3 CustomField2=500        AuditBeginTimestamp=2019-03-01 08:32:07.983Z    RemoteHost=10.42.194.84 ErrorCategory=ERROR    

ServerIPAddress=10.42.179.134   ServiceName=/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate      

ServiceInstanceId=null   ClassName=org.openecomp.sdc.be.filters.BeServletFilter  ResponseDescription=Internal Server Error      

ErrorCode=500   null

 

 

In the same log file I found a lot of messages like this one:

 

2019-03-01T09:21:31.850Z        [qtp215145189-399996]   INFO    o.o.sdc.be.filters.BeServletFilter      AuditMessage=ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "aai" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"  ResponseCode=500        InstanceUUID=aai-ml     RequestId=7f01a5b2-ee38-42c9-b7a4-330f020a4134 AlertSeverity=0  ElapsedTime=169 EndTimestamp=2019-03-01 09:21:31.850Z   PartnerName=Apache-HttpClient/4.5.6 (Java/1.8.0_171)    auditOn=true    ServerFQDN=dev-sdc-sdc-be-656bd64b9b-jh57x      StatusCode=ERROR        TargetEntity=registerInDistributionEngine       CustomField1=POST: https://sdc-be.onap:8443/sdc/v1/registerForDistribution      timer=169       CustomField2=500        AuditBeginTimestamp=2019-03-01 09:21:31.681Z    RemoteHost=10.42.209.109        ErrorCategory=ERROR     ServerIPAddress=10.42.179.134   ServiceName=/v1/registerForDistribution ServiceInstanceId=null  ClassName=org.openecomp.sdc.be.filters.BeServletFilter  ResponseDescription=Internal Server Error       ErrorCode=500   ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "aai" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"

 

Thanks.

 

From: onap-discuss@... [mailto:onap-discuss@...] On Behalf Of Calamita Agostino
Sent: Thursday, February 28, 2019 4:13 PM
To: onap-discuss@...
Subject: [onap-discuss] dmaap-message-router NodePort not recheable

 

Hi all,

I have an issue related to connectivity between the sdc-be pod and dmaap-message-router.

My installation is Casablanca 3.0.0 on a 7-VM Kubernetes cluster.

 

All dmaap pods are up and running:

 

dev-dmaap-dbc-pg-0                                            1/1       Running            0          1d        10.42.173.158   onapkm5   <none>

dev-dmaap-dbc-pg-1                                            1/1       Running            0          1d        10.42.188.140   onapkm2   <none>

dev-dmaap-dbc-pgpool-7b748d5894-mr2m9                         1/1       Running            0          1d        10.42.237.193   onapkm3   <none>

dev-dmaap-dbc-pgpool-7b748d5894-n6dks                         1/1       Running            0          1d        10.42.192.244   onapkm2   <none>

dev-dmaap-dmaap-bus-controller-6757c4c86-8rq5p                1/1       Running            0          1d        10.42.185.132   onapkm1   <none>

dev-dmaap-dmaap-dr-db-bb4c67cfd-tm7td                         1/1       Running            0          1d        10.42.152.59    onapkm1   <none>

dev-dmaap-dmaap-dr-node-66c8749959-tpdtf                      1/1       Running            0          1d        10.42.216.13    onapkm2   <none>

dev-dmaap-dmaap-dr-prov-5c766b8d69-qzqn2                      1/1       Running            0          1d        10.42.115.247   onapkm6   <none>

dev-dmaap-message-router-fb9f4bc7d-5z52j                      1/1       Running            0          6h        10.42.138.31    onapkm3   <none>

dev-dmaap-message-router-kafka-5fbc897f48-4bpb6               1/1       Running            0          1d        10.42.78.141    onapkm4   <none>

dev-dmaap-message-router-zookeeper-557954854-8d6p9            1/1       Running            0          1d        10.42.169.205   onapkm1   <none>

 

but when I try to distribute a service from the SDC Portal, I get an “Internal Server Error”.

 

SDC-BE log file traces:

 

2019-02-28T08:50:35.318Z        [qtp215145189-159837]   INFO    o.o.sdc.be.filters.BeServletFilter      ResponseCode=500       

InstanceUUID=null RequestId=dab0fd50-b06e-4a65-b4a8-7d7edeae3e01   AlertSeverity=0 ElapsedTime=99  EndTimestamp=2019-02-28 08:50:35.318Z PartnerName=op0001      auditOn=true       ServerFQDN=dev-sdc-sdc-be-656bd64b9b-jh57x      StatusCode=ERROR       

TargetEntity=Distribution Engine is DOWN       

CustomField1=POST: http://sdc-be.onap:8080/sdc2/rest/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate  

timer=99        CustomField2=500   AuditBeginTimestamp=2019-02-28 08:50:35.219Z    RemoteHost=10.42.194.84 ErrorCategory=ERROR    

ServerIPAddress=10.42.179.134   ServiceName=/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate  

ServiceInstanceId=null  ClassName=org.openecomp.sdc.be.filters.BeServletFilter     ResponseDescription=Internal Server Error      

ErrorCode=500   null

 

Also SDC healthcheck reports that U-EB Cluster is DOWN.

 

Inside the SDC-BE pod, I tried to run a traceroute to “message-router-zookeeper” and to “message-router”.

 

This is the result (the first is OK, the second one is NOT OK):

 

bash-4.4# traceroute  message-router-zookeeper

traceroute to message-router-zookeeper (10.42.169.205), 30 hops max, 46 byte packets

1  10.42.7.46 (10.42.7.46)  0.213 ms  0.005 ms  0.005 ms

2  10.42.190.179 (10.42.190.179)  0.194 ms  0.145 ms  0.135 ms

3  10.42.169.205 (10.42.169.205)  0.461 ms  0.160 ms  0.134 ms

 

bash-4.4# traceroute  message-router

traceroute to message-router (10.43.1.20), 30 hops max, 46 byte packets

1  10.42.0.1 (10.42.0.1)  0.009 ms  0.005 ms  0.005 ms

2  itpat1ng505.palermo.italtel.it (138.132.168.173)  0.344 ms  2.211 ms  1.910 ms     <-- 138.132.168.x is the VM public network

3  138.132.169.2 (138.132.169.2)  5.063 ms  3.859 ms  3.934 ms

4  *  *  *

5  *  *  *

6  *  *  *

 

traceroute to message-router-kafka (10.43.148.154), 30 hops max, 46 byte packets

1  10.42.0.1 (10.42.0.1)  0.006 ms  0.005 ms  0.004 ms

2  itpat1ng505.palermo.italtel.it (138.132.168.173)  0.391 ms  0.337 ms  0.314 ms

3  138.132.169.2 (138.132.169.2)  0.803 ms  0.748 ms  0.807 ms

4  *  *  *

5  *  *  *

6  *  *  *

 

It seems that I cannot reach a NodePort or ClusterIP from inside a pod. This is the routing table inside the pod:

 

bash-4.4# netstat -rn

Kernel IP routing table

Destination     Gateway         Genmask         Flags   MSS Window  irtt Iface

0.0.0.0         10.42.0.1       0.0.0.0         UG        0 0          0 eth0

10.42.0.0       0.0.0.0         255.255.0.0     U         0 0          0 eth0

 

What can I check on Kubernetes Cluster ?
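For the question above, a few generic checks could be scripted roughly as follows. This is a hedged sketch, not part of the thread: the namespace (onap), the service name (message-router), and the kube-proxy pod label are assumptions that may differ on a Rancher-managed cluster like this one.

```shell
#!/bin/sh
# Hedged sketch: checks for a ClusterIP (10.43.x.x) that is unreachable from
# inside a pod while pod-network addresses (10.42.x.x) work. The namespace,
# service name, and kube-proxy label below are assumptions; adapt as needed.
check_clusterip() {
  ns=$1
  svc=$2
  # 1. The Service exists and has endpoints (no endpoints => selector mismatch)
  kubectl -n "$ns" get svc "$svc" -o wide
  kubectl -n "$ns" get endpoints "$svc"
  # 2. kube-proxy programs the ClusterIP NAT rules; it must be healthy on
  #    every node (the label may differ per distribution)
  kubectl -n kube-system get pods -l k8s-app=kube-proxy -o wide
  # 3. On the node hosting the client pod, the ClusterIP should appear in the
  #    KUBE-SERVICES chain; if not, kube-proxy is not syncing rules there
  ip=$(kubectl -n "$ns" get svc "$svc" -o jsonpath='{.spec.clusterIP}')
  sudo iptables -t nat -L KUBE-SERVICES -n | grep "$ip"
}

# Run only where kubectl is available:
command -v kubectl >/dev/null 2>&1 && check_clusterip onap message-router || true
```

If step 3 shows no rule for the ClusterIP on the affected node, kube-proxy on that node is the place to look.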

 

Thanks.

Agostino.

 

Internet Email Confidentiality Footer ** La presente comunicazione, con le informazioni in essa contenute e ogni documento o file allegato, e' rivolta unicamente alla/e persona/e cui e' indirizzata ed alle altre da questa autorizzata/e a riceverla. Se non siete i destinatari/autorizzati siete avvisati che qualsiasi azione, copia, comunicazione, divulgazione o simili basate sul contenuto di tali informazioni e' vietata e potrebbe essere contro la legge vigente (ad es. art. 616 C.P., D.Lgs n. 196/2003 Codice Privacy, Regolamento Europeo n. 679/2016/GDPR). Se avete ricevuto questa comunicazione per errore, vi preghiamo di darne immediata notizia al mittente e di distruggere il messaggio originale e ogni file allegato senza farne copia alcuna o riprodurne in alcun modo il contenuto. Al link seguente e' disponibile l'informativa Privacy: http://www.italtel.com/it/about/privacy/ ** This e-mail and its attachments are intended for the addressee(s) only and are confidential and/or may contain legally privileged information. If you have received this message by mistake or are not one of the addressees above, you may take no action based on it, and you may not copy or show it to anyone; please reply to this e-mail and point out the error which has occurred. Click here to read your privacy notice: http://www.italtel.com/it/about/privacy/



Re: dmaap-message-router NodePort not reachable

Brian Freeman
 

Hmm

 

Those two SDC-BE messages are for clients trying to register, so I think that is expected until SDC registers with DMaaP/MR.

Is there an SDC-BE error in its communication with DMaaP/MR?

 

The other thing to try is a redeploy of dev-dmaap (remove dockerdata-nfs/dev-dmaap to clean out the topic registrations)

And dev-sdc (again just to start them fresh)
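The redeploy Brian suggests could be scripted roughly as below. This is a hedged sketch assembled from the helm commands quoted later in this thread; the override file paths and the OOM `helm deploy` plugin invocation are taken from Brian's later message and may need adjusting for your environment.

```shell
#!/bin/sh
# Hedged sketch of the "redeploy fresh" procedure for an OOM component.
# Paths and override files are examples from this thread, not universal.
redeploy_component() {
  comp=$1   # e.g. dmaap or sdc
  helm delete "dev-$comp" --purge
  # clear persisted state (e.g. DMaaP topic registrations) so it starts fresh
  rm -rf "/dockerdata-nfs/dev-$comp"
  # confirm the pv/pvc/pods are really gone before redeploying
  kubectl -n onap get pv,pvc,pods 2>/dev/null | grep "dev-$comp"
  # OOM's helm "deploy" plugin, as used elsewhere in this thread
  helm deploy "dev-$comp" local/onap \
    -f /root/oom/kubernetes/onap/resources/environments/public-cloud.yaml \
    -f /root/integration-override.yaml --namespace onap
}

# Destructive; uncomment to run where helm/kubectl are available:
# redeploy_component dmaap && redeploy_component sdc
```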

 

Brian

 

 

 

From: Calamita Agostino <agostino.calamita@...>
Sent: Monday, March 04, 2019 5:21 AM
To: FREEMAN, BRIAN D <bf1936@...>; onap-discuss@...
Subject: Re: [onap-discuss] dmaap-message-router NodePort not reachable

 

Hi, after the redeploy of SDC, the healthcheck returns “DMaaP: None” in the SDC Health Check, and distribution of a service returns a POL5000 error.

 

In SDC-FE error log files I see:

 

2019-03-04T09:52:38.847Z        [qtp215145189-44]       INFO    o.o.sdc.fe.servlets.FeProxyServlet      timer=12        ErrorCategory=INFO      RequestId=null ServiceName=SDC catalog serviceInstanceID=null  ErrorCode=0     uuid=599a9bb3-d3c8-4cea-a926-ea6a11762a63       userId=op0001   localAddr=10.42.236.172        remoteAddr=10.42.98.28  SC="500"

 

 

And in SDC-BE error log file I see these messages:

 

2019-03-04T10:15:21.209Z        [qtp215145189-16]       INFO    o.o.sdc.be.filters.BeServletFilter      AuditMessage=ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "clamp" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"       AlertSeverity=0 ElapsedTime=96  EndTimestamp=2019-03-04 10:15:21.208Z  auditOn=false   ServerFQDN=dev-sdc-sdc-be-656bd64b9b-5b89b      StatusCode=ERROR        timer=96        ServiceInstanceId=null  ClassName=org.openecomp.sdc.be.filters.BeServletFilter ResponseDescription=Internal Server Error       ResponseCode=500        InstanceUUID=clamp      RequestId=8c56c7a2-da62-4598-963d-b5ed13673fce PartnerName=Apache-HttpClient/4.5.6 (Java/1.8.0_181)    TargetEntity=registerInDistributionEngine       CustomField1=POST: https://sdc-be.onap:8443/sdc/v1/registerForDistribution     CustomField2=500        AuditBeginTimestamp=2019-03-04 10:15:21.112Z    RemoteHost=10.42.216.84        ErrorCategory=INFO      ServerIPAddress=10.42.13.89     ServiceName=/v1/registerForDistribution ErrorCode=0     POST /sdc/v1/registerForDistribution HTTP/1.1 SC="500"

 

2019-03-04T10:15:24.740Z        [qtp215145189-20]       ERROR   o.o.s.c.config.EcompErrorLogUtil        alarmSeverity=MAJOR     AuditBeginTimestamp=2019-03-04 10:15:24.690Z   AuditMessage=ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "policy" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"      RequestId=ab5939bb-6780-42e7-b63a-54381b74c352  ErrorCategory=ERROR     ServerIPAddress=10.42.13.89     ServiceName=/v1/registerForDistribution        ErrorCode=500   PartnerName=Apache-HttpClient/4.5.5 (Java/1.8.0_171)    auditOn=true    ServerFQDN=dev-sdc-sdc-be-656bd64b9b-5b89b    TargetEntity=registerInDistributionEngine        Error occured in Distribution Engine. Failed operation: registration validation failed

 

 

Any other check to do ?

 

Thank.

Agos.

 

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 1, 2019 15:39
To: Calamita Agostino <agostino.calamita@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not reachable

 

I’d do a helm delete dev-sdc --purge

Delete /dockerdata-nfs/dev-sdc

Confirm the pv/pvc/pods are gone

Then

 

helm deploy dev-sdc local/onap -f /root/oom/kubernetes/onap/resources/environments/public-cloud.yaml -f /root/integration-override.yaml --namespace onap  --verbose

 

(or whatever your override files are)

 

Looks like SDC came up before dmaap and is confused.

 

There are some less intrusive things to try, but you need SDC to pass Health Check (with DMaaP up from its perspective)

Basic SDC Health Check                                                (DMaaP:UP)| PASS |

 

 

Brian

 

 

From: Calamita Agostino <agostino.calamita@...>
Sent: Friday, March 01, 2019 9:30 AM
To: FREEMAN, BRIAN D <bf1936@...>; onap-discuss@...
Subject: Re: [onap-discuss] dmaap-message-router NodePort not reachable

 

It works.

 

curl -X POST http://138.132.168.85:30227/events/TEST_TOPIC -H 'cache-control: no-cache'   -H 'content-type: application/json'  -H 'postman-token: 1c679102-85e8-f1a2-e708-3e6d84f8ea06' -d '{ "test": "success",                "timestamp": "1/1/2020" }'

{

    "serverTimeMs": 1,

    "count": 1

 

curl -X GET 'http://138.132.168.85:30227/events/TEST_TOPIC/g1/c3?timeout=5000' -H 'accept: application/json'  -H 'cache-control: no-cache'  -H 'postman-token: 04778117-fd44-0cac-b70c-ef2a2c3024af'                       

["{\"test\":\"success\",\"timestamp\":\"1/1/2020\"}"]

 

Agos.

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 1, 2019 15:21
To: Calamita Agostino <agostino.calamita@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not reachable

 

Casablanca.

 

OK

 

Use curl or POSTMAN to write to a TEST_TOPIC (unauthenticated topics are created on demand)

(replace 10.12.5.13 with one of your k8s host IPs); you don’t need the postman-token header, and you can modify the rest for your environment and preferences.

 

curl -X POST \

  http://10.12.5.13:30227/events/TEST_TOPIC \

  -H 'cache-control: no-cache' \

  -H 'content-type: application/json' \

  -H 'postman-token: 1c679102-85e8-f1a2-e708-3e6d84f8ea06' \

  -d '{ "test": "success",

               "timestamp": "1/1/2020"

}'

 

Then do a GET:

 

curl -X GET \

  'http://10.12.5.13:30227/events/TEST_TOPIC/g1/c3?timeout=5000' \

  -H 'accept: application/json' \

  -H 'cache-control: no-cache' \

  -H 'postman-token: 04778117-fd44-0cac-b70c-ef2a2c3024af'

 

 

You should get the test/timestamp object back on the GET (you have to execute the POST/GET twice on the initial topic create).

 

This is to confirm that Message Router is internally talking to itself correctly.
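The POST/GET pair above can be wrapped in a small helper. A hedged sketch: the NodePort 30227 and the TEST_TOPIC name come from this thread, and the double POST mirrors Brian's note that the first write on a new topic only creates it.

```shell
#!/bin/sh
# Hedged sketch of the Message Router smoke test described above.
# HOST must be a reachable k8s node IP; 30227 is the MR NodePort in this thread.
mr_smoke_test() {
  host=$1
  topic=${2:-TEST_TOPIC}
  url="http://$host:30227/events/$topic"
  # POST twice: the first publish on a brand-new topic only creates it
  for _ in 1 2; do
    curl -s -X POST "$url" \
      -H 'content-type: application/json' \
      -d '{"test":"success","timestamp":"1/1/2020"}'
    echo
  done
  # g1/c3 = consumer group / consumer id; any pair works for a quick check
  curl -s "$url/g1/c3?timeout=5000" -H 'accept: application/json'
  echo
}

# Example (replace with one of your node IPs):
# mr_smoke_test 138.132.168.85
```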

 

 

Brian

 

From: Calamita Agostino <agostino.calamita@...>
Sent: Friday, March 01, 2019 9:13 AM
To: FREEMAN, BRIAN D <bf1936@...>; onap-discuss@...
Subject: Re: [onap-discuss] dmaap-message-router NodePort not reachable

 

This is the output:

 

Executing robot tests at log level TRACE

[ ERROR ] Suite 'Testsuites' contains no tests with tag 'healthmr'.

 

Try --help for usage information.

command terminated with exit code 252

 

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 1, 2019 15:12
To: Calamita Agostino <agostino.calamita@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not reachable

 

Please try ./ete-k8s.sh onap healthmr

 

From: Calamita Agostino <agostino.calamita@...>
Sent: Friday, March 01, 2019 9:09 AM
To: FREEMAN, BRIAN D <bf1936@...>; onap-discuss@...
Subject: Re: [onap-discuss] dmaap-message-router NodePort not reachable

 

I didn’t find the healthmr test, only health.

(

./ete-k8s.sh onap

Usage: ete-k8s.sh [namespace] [ health | healthdist | distribute | instantiate | instantiateVFWCL | instantiateDemoVFWCL |  | portal ] )

 

The command ./ete-k8s.sh onap health reports the list below (Basic DMAAP Message Router Health Check = PASS).

In my environment there are some PODs not in Running state:

 

dev-aai-aai-data-router-5d55646cdc-cc62v                      1/2       CrashLoopBackOff   1084       4d        10.42.79.203    onapkm3   <none>

dev-appc-appc-ansible-server-76fcf9454d-8km9d                 0/1       CrashLoopBackOff   1656       6d        10.42.212.202   onapkm0   <none>

dev-oof-oof-has-api-585497f5-ktjsv                            0/1       Init:0/3           1085       8d        10.42.86.82     onapkm0   <none>

dev-oof-oof-has-controller-9469b9ff8-td4k9                    0/1       Init:1/3           945        8d        10.42.5.110     onapkm2   <none>

dev-oof-oof-has-data-d559897dc-4lmkt                          0/1       Init:1/4           1091       8d        10.42.199.220   onapkm3   <none>

dev-oof-oof-has-healthcheck-jq9xq                             0/1       Init:0/1           1092       8d        10.42.242.145   onapkm3   <none>

dev-oof-oof-has-reservation-868c7c88ff-pv79n                  0/1       Init:1/4           1081       8d        10.42.176.61    onapkm1   <none>

dev-oof-oof-has-solver-6f8bc6fdf4-tw4cj                       0/1       Init:1/4           1084       8d        10.42.29.154    onapkm0   <none>

dev-sdnc-sdnc-ansible-server-7c76f965c6-hqtzl                 0/1       CrashLoopBackOff   1844       8d        10.42.202.36    onapkm3   <none>

dev-sdnc-sdnc-ueb-listener-6d74459c6-tdqhc                    0/1       CrashLoopBackOff   542        1d        10.42.219.51    onapkm2   <none>

 

and multicloud is not deployed.

 

==============================================================================

Testsuites

==============================================================================

Testsuites.Health-Check :: Testing ecomp components are available via calls.

==============================================================================

Basic A&AI Health Check                                               | PASS |

------------------------------------------------------------------------------

Basic AAF Health Check                                                | PASS |

------------------------------------------------------------------------------

Basic AAF SMS Health Check                                            | PASS |

------------------------------------------------------------------------------

Basic APPC Health Check                                               | PASS |

------------------------------------------------------------------------------

Basic CLI Health Check                                                | PASS |

------------------------------------------------------------------------------

Basic CLAMP Health Check                                              | PASS |

------------------------------------------------------------------------------

Basic DCAE Health Check                                               [ WARN ] Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e8102250>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /healthcheck

[ WARN ] Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e817a8d0>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /healthcheck

[ WARN ] Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e817a850>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /healthcheck

| FAIL |

ConnectionError: HTTPConnectionPool(host='dcae-healthcheck.onap', port=80): Max retries exceeded with url: /healthcheck (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e817a5d0>: Failed to establish a new connection: [Errno -2] Name or service not known',))

------------------------------------------------------------------------------

Basic DMAAP Data Router Health Check                                  | PASS |

------------------------------------------------------------------------------

Basic DMAAP Message Router Health Check                               | PASS |

------------------------------------------------------------------------------

Basic External API NBI Health Check                                   [ WARN ] Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e80d6c50>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /nbi/api/v3/status

[ WARN ] Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e80c4e10>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /nbi/api/v3/status

[ WARN ] Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e8199f50>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /nbi/api/v3/status

| FAIL |

ConnectionError: HTTPConnectionPool(host='nbi.onap', port=8080): Max retries exceeded with url: /nbi/api/v3/status (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e80e5410>: Failed to establish a new connection: [Errno -2] Name or service not known',))

------------------------------------------------------------------------------

Basic Log Elasticsearch Health Check                                  | PASS |

------------------------------------------------------------------------------

Basic Log Kibana Health Check                                         | PASS |

------------------------------------------------------------------------------

Basic Log Logstash Health Check                                       | PASS |

------------------------------------------------------------------------------

Basic Microservice Bus Health Check                                   | PASS |

------------------------------------------------------------------------------

Basic Multicloud API Health Check                                     | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-ocata API Health Check                               | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-pike API Health Check                                | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-titanium_cloud API Health Check                      | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-vio API Health Check                                 | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic OOF-Homing Health Check                                         | FAIL |

Test timeout 10 seconds exceeded.

------------------------------------------------------------------------------

Basic OOF-SNIRO Health Check                                          | PASS |

------------------------------------------------------------------------------

Basic OOF-CMSO Health Check                                           | PASS |

------------------------------------------------------------------------------

Basic Policy Health Check                                             | PASS |

------------------------------------------------------------------------------

Basic Pomba AAI-context-builder Health Check                          | PASS |

------------------------------------------------------------------------------

Basic Pomba SDC-context-builder Health Check                          | PASS |

------------------------------------------------------------------------------

Basic Pomba Network-discovery-context-builder Health Check            | PASS |

------------------------------------------------------------------------------

Basic Portal Health Check                                             | PASS |

------------------------------------------------------------------------------

Basic SDC Health Check                                                (DMaaP:None)| PASS |

------------------------------------------------------------------------------

Basic SDNC Health Check                                               | PASS |

------------------------------------------------------------------------------

Basic SO Health Check                                                 | PASS |

------------------------------------------------------------------------------

Basic UseCaseUI API Health Check                                      | PASS |

------------------------------------------------------------------------------

Basic VFC catalog API Health Check                                    | PASS |

------------------------------------------------------------------------------

Basic VFC emsdriver API Health Check                                  | PASS |

------------------------------------------------------------------------------

Basic VFC gvnfmdriver API Health Check                                | PASS |

------------------------------------------------------------------------------

Basic VFC huaweivnfmdriver API Health Check                           | PASS |

------------------------------------------------------------------------------

Basic VFC jujuvnfmdriver API Health Check                             | PASS |

------------------------------------------------------------------------------

Basic VFC multivimproxy API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VFC nokiavnfmdriver API Health Check                            | PASS |

------------------------------------------------------------------------------

Basic VFC nokiav2driver API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VFC nslcm API Health Check                                      | PASS |

------------------------------------------------------------------------------

Basic VFC resmgr API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC vnflcm API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC vnfmgr API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC vnfres API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC workflow API Health Check                                   | PASS |

------------------------------------------------------------------------------

Basic VFC ztesdncdriver API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VFC ztevnfmdriver API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VID Health Check                                                | PASS |

------------------------------------------------------------------------------

Basic VNFSDK Health Check                                             | PASS |

------------------------------------------------------------------------------

Basic Holmes Rule Management API Health Check                         | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Holmes Engine Management API Health Check                       | FAIL |

502 != 200

------------------------------------------------------------------------------

Testsuites.Health-Check :: Testing ecomp components are available ... | FAIL |

51 critical tests, 41 passed, 10 failed

51 tests total, 41 passed, 10 failed

==============================================================================

Testsuites                                                            | FAIL |

51 critical tests, 41 passed, 10 failed

51 tests total, 41 passed, 10 failed

==============================================================================

Output:  /share/logs/0001_ete_health/output.xml

Log:     /share/logs/0001_ete_health/log.html

Report:  /share/logs/0001_ete_health/report.html

command terminated with exit code 10

 

 

 

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 1, 2019 14:49
To: onap-discuss@...; Calamita Agostino <agostino.calamita@...>
Subject: RE: [onap-discuss] dmaap-message-router NodePort not reachable

 

Try a POST to make sure you can write to message router.

I doubt it's connectivity.

 

If you are on the master branch, try ./ete-k8s.sh onap healthmr to test a write/read of a test topic.

 

(do it twice, since the first time it creates a test topic and Kafka doesn't forward the message until both the publisher and the subscriber have connected)

 

Brian
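
Brian's POST suggestion can also be done by hand with curl; a minimal sketch, assuming the in-cluster service name and port (message-router:3904, as used in the wget test further down) and a hypothetical topic name:

```shell
MR=${MR:-http://message-router:3904}
TOPIC=unauthenticated.connectivity.test   # hypothetical test topic name

# Publish one message; Message Router auto-creates the topic on first POST.
curl -s -X POST -H 'Content-Type: application/json' \
  -d '{"msg":"ping"}' "$MR/events/$TOPIC" || echo "publish failed"

# Read it back with a consumer-group/consumer-id pair. On a brand-new topic,
# run the publish+consume pair twice, for the reason Brian gives above.
curl -s "$MR/events/$TOPIC/g1/c1?timeout=5000" || echo "consume failed"
```

Getting the published message echoed back by the consume call confirms both the write and read paths through Kafka, not just TCP connectivity.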

 

 

From: onap-discuss@... <onap-discuss@...> On Behalf Of Calamita Agostino
Sent: Friday, March 01, 2019 4:30 AM
To: onap-discuss@...
Subject: R: [onap-discuss] dmaap-message-router NodePort not recheable

 

I tried executing a wget command from the sdc-be POD against the message-router REST API, and I see that dmaap-message-router is reachable from sdc-be.

 

This is the result:

 

# kubectl exec -it  dev-sdc-sdc-be-656bd64b9b-jh57x  -n onap -- /bin/bash

 

bash-4.4# wget "http://message-router:3904/topics"

Connecting to message-router:3904 (10.43.1.20:3904)

topics               100% |*******************************|   131   0:00:00 ETA

bash-4.4# cat topics

{"topics": [

    "__consumer_offsets",

    "champRawEvents",

    "SDC-DISTR-NOTIF-TOPIC-AUTO",

    "org.onap.dmaap.mr.PNF_READY"

]}bash-4.4#

 

But the audit.log of sdc-be, after a “Distribution Service” action from the Portal, says:

 

2019-03-01T08:32:07.986Z        [qtp215145189-323354]   INFO    o.o.sdc.be.filters.BeServletFilter     

ResponseCode=500        InstanceUUID=null       RequestId=d2f65e19-b07b-4266-8be2-f170aba42fb1  AlertSeverity=0 ElapsedTime=3  

EndTimestamp=2019-03-01 08:32:07.986Z   PartnerName=op0001      auditOn=true    ServerFQDN=dev-sdc-sdc-be-656bd64b9b-jh57x      

StatusCode=ERROR        TargetEntity=Distribution Engine is DOWN       

CustomField1=POST: http://sdc-be.onap:8080/sdc2/rest/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate       

timer=3 CustomField2=500        AuditBeginTimestamp=2019-03-01 08:32:07.983Z    RemoteHost=10.42.194.84 ErrorCategory=ERROR    

ServerIPAddress=10.42.179.134   ServiceName=/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate      

ServiceInstanceId=null   ClassName=org.openecomp.sdc.be.filters.BeServletFilter  ResponseDescription=Internal Server Error      

ErrorCode=500   null

 

 

In the same log file I found a lot of messages like this one:

 

2019-03-01T09:21:31.850Z        [qtp215145189-399996]   INFO    o.o.sdc.be.filters.BeServletFilter      AuditMessage=ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "aai" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"  ResponseCode=500        InstanceUUID=aai-ml     RequestId=7f01a5b2-ee38-42c9-b7a4-330f020a4134 AlertSeverity=0  ElapsedTime=169 EndTimestamp=2019-03-01 09:21:31.850Z   PartnerName=Apache-HttpClient/4.5.6 (Java/1.8.0_171)    auditOn=true    ServerFQDN=dev-sdc-sdc-be-656bd64b9b-jh57x      StatusCode=ERROR        TargetEntity=registerInDistributionEngine       CustomField1=POST: https://sdc-be.onap:8443/sdc/v1/registerForDistribution      timer=169       CustomField2=500        AuditBeginTimestamp=2019-03-01 09:21:31.681Z    RemoteHost=10.42.209.109        ErrorCategory=ERROR     ServerIPAddress=10.42.179.134   ServiceName=/v1/registerForDistribution ServiceInstanceId=null  ClassName=org.openecomp.sdc.be.filters.BeServletFilter  ResponseDescription=Internal Server Error       ErrorCode=500   ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "aai" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"

 

Thanks.

 

Da: onap-discuss@... [mailto:onap-discuss@...] Per conto di Calamita Agostino
Inviato: giovedì 28 febbraio 2019 16:13
A: onap-discuss@...
Oggetto: [onap-discuss] dmaap-message-router NodePort not recheable

 

Hi all,

I have an issue related to connectivity between the sdc-be pod and dmaap-message-router.

My installation is Casablanca 3.0.0 on a 7-VM Kubernetes cluster.

 

All dmaap pods are up and running:

 

dev-dmaap-dbc-pg-0                                            1/1       Running            0          1d        10.42.173.158   onapkm5   <none>

dev-dmaap-dbc-pg-1                                            1/1       Running            0          1d        10.42.188.140   onapkm2   <none>

dev-dmaap-dbc-pgpool-7b748d5894-mr2m9                         1/1       Running            0          1d        10.42.237.193   onapkm3   <none>

dev-dmaap-dbc-pgpool-7b748d5894-n6dks                         1/1       Running            0          1d        10.42.192.244   onapkm2   <none>

dev-dmaap-dmaap-bus-controller-6757c4c86-8rq5p                1/1       Running            0          1d        10.42.185.132   onapkm1   <none>

dev-dmaap-dmaap-dr-db-bb4c67cfd-tm7td                         1/1       Running            0          1d        10.42.152.59    onapkm1   <none>

dev-dmaap-dmaap-dr-node-66c8749959-tpdtf                      1/1       Running            0          1d        10.42.216.13    onapkm2   <none>

dev-dmaap-dmaap-dr-prov-5c766b8d69-qzqn2                      1/1       Running            0          1d        10.42.115.247   onapkm6   <none>

dev-dmaap-message-router-fb9f4bc7d-5z52j                      1/1       Running            0          6h        10.42.138.31    onapkm3   <none>

dev-dmaap-message-router-kafka-5fbc897f48-4bpb6               1/1       Running            0          1d        10.42.78.141    onapkm4   <none>

dev-dmaap-message-router-zookeeper-557954854-8d6p9            1/1       Running            0          1d        10.42.169.205   onapkm1   <none>

 

but when I try to distribute a service from the SDC Portal, I get “Internal Server Error”.

 

SDC-BE log file traces:

 

2019-02-28T08:50:35.318Z        [qtp215145189-159837]   INFO    o.o.sdc.be.filters.BeServletFilter      ResponseCode=500       

InstanceUUID=null RequestId=dab0fd50-b06e-4a65-b4a8-7d7edeae3e01   AlertSeverity=0 ElapsedTime=99  EndTimestamp=2019-02-28 08:50:35.318Z PartnerName=op0001      auditOn=true       ServerFQDN=dev-sdc-sdc-be-656bd64b9b-jh57x      StatusCode=ERROR       

TargetEntity=Distribution Engine is DOWN       

CustomField1=POST: http://sdc-be.onap:8080/sdc2/rest/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate  

timer=99        CustomField2=500   AuditBeginTimestamp=2019-02-28 08:50:35.219Z    RemoteHost=10.42.194.84 ErrorCategory=ERROR    

ServerIPAddress=10.42.179.134   ServiceName=/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate  

ServiceInstanceId=null  ClassName=org.openecomp.sdc.be.filters.BeServletFilter     ResponseDescription=Internal Server Error      

ErrorCode=500   null

 

Also SDC healthcheck reports that U-EB Cluster is DOWN.

 

Inside the SDC-BE POD, I ran a traceroute to “message-router-zookeeper” and to “message-router”.

 

This is the result (the first is OK, the second one is NOT OK):

 

bash-4.4# traceroute  message-router-zookeeper

traceroute to message-router-zookeeper (10.42.169.205), 30 hops max, 46 byte packets

1  10.42.7.46 (10.42.7.46)  0.213 ms  0.005 ms  0.005 ms

2  10.42.190.179 (10.42.190.179)  0.194 ms  0.145 ms  0.135 ms

3  10.42.169.205 (10.42.169.205)  0.461 ms  0.160 ms  0.134 ms

 

bash-4.4# traceroute  message-router

traceroute to message-router (10.43.1.20), 30 hops max, 46 byte packets

1  10.42.0.1 (10.42.0.1)  0.009 ms  0.005 ms  0.005 ms

2  itpat1ng505.palermo.italtel.it (138.132.168.173)  0.344 ms  2.211 ms  1.910 ms     <- 138.132.168.X is the VM public network

3  138.132.169.2 (138.132.169.2)  5.063 ms  3.859 ms  3.934 ms

4  *  *  *

5  *  *  *

6  *  *  *

 

bash-4.4# traceroute  message-router-kafka

traceroute to message-router-kafka (10.43.148.154), 30 hops max, 46 byte packets

1  10.42.0.1 (10.42.0.1)  0.006 ms  0.005 ms  0.004 ms

2  itpat1ng505.palermo.italtel.it (138.132.168.173)  0.391 ms  0.337 ms  0.314 ms

3  138.132.169.2 (138.132.169.2)  0.803 ms  0.748 ms  0.807 ms

4  *  *  *

5  *  *  *

6  *  *  *

 

It seems that I cannot reach NodePort or ClusterIP addresses from inside a POD. This is the routing table inside the POD:

 

bash-4.4# netstat -rn

Kernel IP routing table

Destination     Gateway         Genmask         Flags   MSS Window  irtt Iface

0.0.0.0         10.42.0.1       0.0.0.0         UG        0 0          0 eth0

10.42.0.0       0.0.0.0         255.255.0.0     U         0 0          0 eth0

 

What can I check on Kubernetes Cluster ?
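
For reference, the routing table above is consistent with the symptom: the pod only has a route for the 10.42.0.0/16 pod network, so service VIPs in 10.43.0.0/16 (ClusterIPs such as message-router's 10.43.1.20) fall through to the default gateway and leave the node, instead of being NATed by kube-proxy. A minimal sketch of that membership check (my own illustrative helper, not a Kubernetes tool; the /16 CIDRs are the Rancher defaults visible in the traceroutes above):

```shell
# Crude /16 membership test mirroring the pod's routing table: only
# 10.42.0.0/16 is routed via eth0; everything else takes the default route.
in_cidr16() {
  case "$1" in "$2".*) echo yes ;; *) echo no ;; esac
}

in_cidr16 10.42.169.205 10.42   # message-router-zookeeper pod IP (routed)
in_cidr16 10.43.1.20    10.42   # message-router ClusterIP (not routed)
```

On the worker nodes themselves, a reasonable next check is that kube-proxy is running and that the KUBE-SERVICES iptables NAT chain (e.g. `iptables -t nat -L KUBE-SERVICES`) contains entries for the 10.43.0.0/16 service CIDR.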

 

Thanks.

Agostino.

 

Internet Email Confidentiality Footer ** La presente comunicazione, con le informazioni in essa contenute e ogni documento o file allegato, e' rivolta unicamente alla/e persona/e cui e' indirizzata ed alle altre da questa autorizzata/e a riceverla. Se non siete i destinatari/autorizzati siete avvisati che qualsiasi azione, copia, comunicazione, divulgazione o simili basate sul contenuto di tali informazioni e' vietata e potrebbe essere contro la legge vigente (ad es. art. 616 C.P., D.Lgs n. 196/2003 Codice Privacy, Regolamento Europeo n. 679/2016/GDPR). Se avete ricevuto questa comunicazione per errore, vi preghiamo di darne immediata notizia al mittente e di distruggere il messaggio originale e ogni file allegato senza farne copia alcuna o riprodurne in alcun modo il contenuto. Al link seguente e' disponibile l'informativa Privacy: http://www.italtel.com/it/about/privacy/ ** This e-mail and its attachments are intended for the addressee(s) only and are confidential and/or may contain legally privileged information. If you have received this message by mistake or are not one of the addressees above, you may take no action based on it, and you may not copy or show it to anyone; please reply to this e-mail and point out the error which has occurred. Click here to read your privacy notice: http://www.italtel.com/it/about/privacy/



FW: Updated invitation: Lab access and capacity #dublin @ Mon Mar 4, 2019 9:30am - 10:30am (MST) (yang.xu3@huawei.com) #dublin

Yang Xu
 

 
 
-----Original Appointment-----

From: jbaker@... [mailto:jbaker@...]
Sent: Monday, March 04, 2019 8:35 AM
To: jbaker@...; rb2745@...; kpaul@...; eddy.raineri@...; Gary Wu; Yang Xu (Yang, Fixed Network); bf1936@...
Subject: Updated invitation: Lab access and capacity #dublin @ Mon Mar 4, 2019 9:30am - 10:30am (MST) (yang.xu3@...)
When: Monday, March 04, 2019 11:30 AM-12:30 PM (UTC-05:00) Eastern Time (US & Canada).
Where: https://zoom.us/j/394640816
 
 
This event has been changed.
Lab access and capacity #dublin
When         Changed: Mon Mar 4, 2019 9:30am – 10:30am Mountain Time - Denver        
Where         https://zoom.us/j/394640816 (map)        
Calendar         yang.xu3@...        
Who
                jbaker@... - organizer        
        rb2745@...        
        kpaul@...        
        eddy.raineri@...        
        gary.i.wu@...        
        yang.xu3@...        
        bf1936@...        
 
       
Several Dublin release risks are tied to lab access. Let's get a common understanding and a publicized plan for lab space and any known future changes.
I've booked an hour and hope to finish in 30 min :-)
──────────
Jim Baker (LFN) is inviting you to a scheduled Zoom meeting.
Join Zoom Meeting
https://zoom.us/j/394640816
One tap mobile
+16699006833,,394640816# US (San Jose)
+16465588656,,394640816# US (New York)
Dial by your location
+1 669 900 6833 US (San Jose)
+1 646 558 8656 US (New York)
855 880 1246 US Toll-free
877 369 0926 US Toll-free
Meeting ID: 394 640 816
Find your local number: https://zoom.us/u/adoCBQyqvE
──────────
 


Re: Casablanca APPC Readiness probe failed: APPC is not healthy for more than 2 hours

Taka Cho
 

Hi,

 

In Casablanca, those ODL features are installed when the APPC container starts up. In R4 we moved ODL feature installation into the APPC container image build, so it won’t take that long in R4.

 

But 1802 seconds for the install is just too long. Have you checked your k8s environment or VM capacity?

 

Taka

 

From: onap-discuss@... <onap-discuss@...> On Behalf Of Vivekanandan Muthukrishnan
Sent: Sunday, March 3, 2019 6:23 AM
To: onap-discuss@...
Subject: [onap-discuss] Casablanca APPC Readiness probe failed: APPC is not healthy for more than 2 hours

 

Hi All,

 

It seems like the APPC POD dev-appc-appc-0 (container appc) is taking a long time to install KARAF bundles, and dev-appc-appc-ansible-server keeps getting restarted.

 

Is this expected? Is there any workaround to load the KARAF packages from a local Maven repository?

 

# All APPC PODs

$ kubectl get pods -n onap | grep appc

dev-appc-appc-0                                               1/2       Running            0          2h

dev-appc-appc-ansible-server-6877b497df-j544r                 0/1       Init:0/1           3          37m

dev-appc-appc-cdt-77bccf4847-fmtpw                            1/1       Running            0          2h

dev-appc-appc-db-0                                            1/1       Running            1          2h

dev-appc-appc-db-1                                            1/1       Running            0          2h

dev-appc-appc-db-2                                            1/1       Running            1          2h

dev-appc-appc-dgbuilder-f7565468-fnrz6                        1/1       Running            0          2h

 

 

# It seems like the ODL KARAF features are still being installed, and it is taking a long time

$ kubectl logs -n onap dev-appc-appc-0 -c appc

Adding feature url mvn:org.onap.appc/onap-appc-design-services/1.4.4/xml/features

Archive:  /opt/onap/appc/features/appc-interfaces-service/appc-interfaces-service-1.4.4.zip

   creating: /opt/opendaylight/system/com/google/code/gson/gson/2.8.0/

   creating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-model/

   creating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-model/1.4.4/

   creating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-bundle/

   creating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-bundle/1.4.4/

   creating: /opt/opendaylight/system/org/onap/appc/onap-appc-interfaces-service/

   creating: /opt/opendaylight/system/org/onap/appc/onap-appc-interfaces-service/1.4.4/

  inflating: /opt/opendaylight/system/com/google/code/gson/gson/2.8.0/_remote.repositories  

  inflating: /opt/opendaylight/system/com/google/code/gson/gson/2.8.0/gson-2.8.0.jar  

  inflating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-model/1.4.4/appc-interfaces-service-model-1.4.4.jar  

  inflating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-model/1.4.4/_remote.repositories  

  inflating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-model/maven-metadata-local.xml  

  inflating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-bundle/1.4.4/appc-interfaces-service-bundle-1.4.4.jar  

  inflating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-bundle/1.4.4/_remote.repositories  

  inflating: /opt/opendaylight/system/org/onap/appc/appc-interfaces-service-bundle/maven-metadata-local.xml  

  inflating: /opt/opendaylight/system/org/onap/appc/onap-appc-interfaces-service/1.4.4/_remote.repositories  

  inflating: /opt/opendaylight/system/org/onap/appc/onap-appc-interfaces-service/1.4.4/onap-appc-interfaces-service-1.4.4-features.xml  

  inflating: /opt/opendaylight/system/org/onap/appc/onap-appc-interfaces-service/maven-metadata-local.xml  

Adding feature url mvn:org.onap.appc/onap-appc-interfaces-service/1.4.4/xml/features

Installing onap-appc-core

Install of onap-appc-core took 1802 seconds

Sleep Finished

Installing onap-appc-metric

 

$ kubectl describe pod -n onap dev-appc-appc-0

Name:           dev-appc-appc-0

Namespace:      onap

Node:           casablanca03/192.168.122.233

Start Time:     Sun, 03 Mar 2019 08:39:03 +0000

Labels:         app=appc

                controller-revision-hash=dev-appc-appc-69d746947b

                release=dev-appc

Annotations:    <none>

Status:         Running

IP:             10.42.135.188

Controlled By:  StatefulSet/dev-appc-appc

Init Containers:

  appc-readiness:

    Container ID:  docker://219c491421ce1c5f83539e84508a642c0726930fe4078a72fbf0de65d17aa236

    Image:         oomk8s/readiness-check:2.0.0

    Image ID:      docker://sha256:867cb038e1d2445a6e5aedc3b5f970dacc8249ab119d6c2e088e10df886ff51f

    Port:          <none>

    Host Port:     <none>

    Command:

      /root/ready.py

    Args:

      --container-name

      appc-db

    State:          Terminated

      Reason:       Completed

      Exit Code:    0

      Started:      Sun, 03 Mar 2019 08:40:03 +0000

      Finished:     Sun, 03 Mar 2019 08:45:25 +0000

    Ready:          True

    Restart Count:  0

    Environment:

      NAMESPACE:  onap (v1:metadata.namespace)

    Mounts:

      /var/run/secrets/kubernetes.io/serviceaccount from default-token-lw9wt (ro)

Containers:

  appc:

    Container ID:  docker://f1c058a8fb540c20beaffcad0349191877527824a0b6e11d55150694c65d6427

    Ports:         8181/TCP, 1830/TCP

    Host Ports:    0/TCP, 0/TCP

    Command:

      /opt/appc/bin/startODL.sh

    State:          Running

      Started:      Sun, 03 Mar 2019 08:48:29 +0000

    Ready:          False

    Restart Count:  0

    Readiness:      exec [/opt/appc/bin/health_check.sh] delay=10s timeout=1s period=10s #success=1 #failure=3

    Environment:

      MYSQL_ROOT_PASSWORD:  <set to the key 'db-root-password' in secret 'dev-appc-appc'>  Optional: false

      SDNC_CONFIG_DIR:      /opt/onap/appc/data/properties

      APPC_CONFIG_DIR:      /opt/onap/appc/data/properties

      DMAAP_TOPIC_ENV:      SUCCESS

      ENABLE_AAF:           true

      ENABLE_ODL_CLUSTER:   false

      APPC_REPLICAS:        1

    Mounts:

      /etc/localtime from localtime (ro)

      /opt/onap/appc/bin/health_check.sh from onap-appc-bin (rw)

      /opt/onap/appc/bin/installAppcDb.sh from onap-appc-bin (rw)

      /opt/onap/appc/bin/startODL.sh from onap-appc-bin (rw)

      /opt/onap/appc/data/properties/aaa-app-config.xml from onap-appc-data-properties (rw)

      /opt/onap/appc/data/properties/aaiclient.properties from onap-appc-data-properties (rw)

      /opt/onap/appc/data/properties/appc.properties from onap-appc-data-properties (rw)

      /opt/onap/appc/data/properties/cadi.properties from onap-appc-data-properties (rw)

      /opt/onap/appc/data/properties/dblib.properties from onap-appc-data-properties (rw)

      /opt/onap/appc/data/properties/svclogic.properties from onap-appc-data-properties (rw)

      /opt/onap/appc/svclogic/bin/showActiveGraphs.sh from onap-appc-svclogic-bin (rw)

      /opt/onap/appc/svclogic/config/svclogic.properties from onap-appc-svclogic-config (rw)

      /opt/onap/ccsdk/bin/installSdncDb.sh from onap-sdnc-bin (rw)

      /opt/onap/ccsdk/bin/startODL.sh from onap-sdnc-bin (rw)

      /opt/onap/ccsdk/data/properties/aaiclient.properties from onap-sdnc-data-properties (rw)

      /opt/onap/ccsdk/data/properties/dblib.properties from onap-sdnc-data-properties (rw)

      /opt/onap/ccsdk/data/properties/svclogic.properties from onap-sdnc-data-properties (rw)

      /opt/onap/ccsdk/svclogic/bin/showActiveGraphs.sh from onap-sdnc-svclogic-bin (rw)

      /opt/onap/ccsdk/svclogic/config/svclogic.properties from onap-sdnc-svclogic-config (rw)

      /opt/opendaylight/current/daexim from dev-appc-appc-data (rw)

      /opt/opendaylight/current/etc/org.ops4j.pax.logging.cfg from log-config (rw)

      /var/log/onap from logs (rw)

      /var/run/secrets/kubernetes.io/serviceaccount from default-token-lw9wt (ro)

  filebeat-onap:

    Container ID:   docker://daf7ddc4a3e4945a1a4cab940906022248696e72c90bb15fa01144cacd3a1833

    Image:          docker.elastic.co/beats/filebeat:5.5.0

    Image ID:       docker://sha256:b61327632415b6d374b9f34cea71cb14f9c352e5259140ce6e3c8eaf8becaa1b

    Port:           <none>

    Host Port:      <none>

    State:          Running

      Started:      Sun, 03 Mar 2019 08:48:30 +0000

    Ready:          True

    Restart Count:  0

    Environment:    <none>

    Mounts:

      /usr/share/filebeat/data from data-filebeat (rw)

      /usr/share/filebeat/filebeat.yml from filebeat-conf (rw)

      /var/log/onap from logs (rw)

      /var/run/secrets/kubernetes.io/serviceaccount from default-token-lw9wt (ro)

Conditions:

  Type              Status

  Initialized       True 

  Ready             False 

  ContainersReady   False 

  PodScheduled      True 

Volumes:

  dev-appc-appc-data:

    Type:       PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)

    ClaimName:  dev-appc-appc-data-dev-appc-appc-0

    ReadOnly:   false

  localtime:

    Type:          HostPath (bare host directory volume)

    Path:          /etc/localtime

    HostPathType:  

  filebeat-conf:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-filebeat

    Optional:  false

  log-config:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-logging-cfg

    Optional:  false

  logs:

    Type:    EmptyDir (a temporary directory that shares a pod's lifetime)

    Medium:  

  data-filebeat:

    Type:    EmptyDir (a temporary directory that shares a pod's lifetime)

    Medium:  

  onap-appc-data-properties:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-appc-data-properties

    Optional:  false

  onap-appc-svclogic-config:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-appc-svclogic-config

    Optional:  false

  onap-appc-svclogic-bin:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-appc-svclogic-bin

    Optional:  false

  onap-appc-bin:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-appc-bin

    Optional:  false

  onap-sdnc-data-properties:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-sdnc-data-properties

    Optional:  false

  onap-sdnc-svclogic-config:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-sdnc-svclogic-config

    Optional:  false

  onap-sdnc-svclogic-bin:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-sdnc-svclogic-bin

    Optional:  false

  onap-sdnc-bin:

    Type:      ConfigMap (a volume populated by a ConfigMap)

    Name:      dev-appc-appc-onap-sdnc-bin

    Optional:  false

  default-token-lw9wt:

    Type:        Secret (a volume populated by a Secret)

    SecretName:  default-token-lw9wt

    Optional:    false

QoS Class:       BestEffort

Node-Selectors:  <none>

Tolerations:     node.kubernetes.io/not-ready:NoExecute for 300s

                 node.kubernetes.io/unreachable:NoExecute for 300s

Events:

  Type     Reason     Age               From                   Message

  ----     ------     ----              ----                   -------

  Warning  Unhealthy  26m (x9 over 1h)  kubelet, casablanca03  Readiness probe failed: APPC is not healthy.

++ ps -e

++ grep startODL

++ wc -l

+ startODL_status=1

++ grep Waiting

++ wc -l

++ /opt/opendaylight/current/bin/client bundle:list

+ waiting_bundles=0

++ /opt/opendaylight/current/bin/client system:start-level

+ run_level='Level 100'

+ '[' 'Level 100' == 'Level 100' ']'

+ '[' 1 -lt 1 ']'

+ echo APPC is not healthy.

+ exit 1

  Warning  Unhealthy  20m (x212 over 1h)  kubelet, casablanca03  (combined from similar events): Readiness probe failed: APPC is not healthy.

++ ps -e

++ grep startODL

++ wc -l

+ startODL_status=1

++ grep Waiting

++ /opt/opendaylight/current/bin/client bundle:list

++ wc -l

+ waiting_bundles=0

++ /opt/opendaylight/current/bin/client system:start-level

+ run_level='Level 100'

+ '[' 'Level 100' == 'Level 100' ']'

+ '[' 1 -lt 1 ']'

+ echo APPC is not healthy.

+ exit 1

  Warning  Unhealthy  16m  kubelet, casablanca03  Readiness probe failed: APPC is not healthy.

++ wc -l

++ ps -e

++ grep startODL

+ startODL_status=1

++ /opt/opendaylight/current/bin/client bundle:list

++ grep Waiting

++ wc -l

+ waiting_bundles=0

++ /opt/opendaylight/current/bin/client system:start-level

+ run_level='Level 100'

+ '[' 'Level 100' == 'Level 100' ']'

+ '[' 1 -lt 1 ']'

+ echo APPC is not healthy.

+ exit 1

  Warning  Unhealthy  11m (x203 over 1h)  kubelet, casablanca03  Readiness probe failed: APPC is not healthy.

++ ps -e

++ grep startODL

++ wc -l

+ startODL_status=1

++ /opt/opendaylight/current/bin/client bundle:list

++ grep Waiting

++ wc -l

+ waiting_bundles=0

++ /opt/opendaylight/current/bin/client system:start-level

+ run_level='Level 100'

+ '[' 'Level 100' == 'Level 100' ']'

+ '[' 1 -lt 1 ']'

+ echo APPC is not healthy.

+ exit 1

  Warning  Unhealthy  5m (x12 over 1h)  kubelet, casablanca03  Readiness probe failed: APPC is not healthy.

++ ps -e

++ wc -l

++ grep startODL

+ startODL_status=1

++ /opt/opendaylight/current/bin/client bundle:list

++ wc -l

++ grep Waiting

+ waiting_bundles=0

++ /opt/opendaylight/current/bin/client system:start-level

+ run_level='Level 100'

+ '[' 'Level 100' == 'Level 100' ']'

+ '[' 1 -lt 1 ']'

+ echo APPC is not healthy.

+ exit 1

  Warning  Unhealthy  1m (x31 over 1h)  kubelet, casablanca03  Readiness probe failed: APPC is not healthy.

++ ps -e

++ wc -l

++ grep startODL

+ startODL_status=1

++ /opt/opendaylight/current/bin/client bundle:list

++ grep Waiting

++ wc -l

+ waiting_bundles=0

++ /opt/opendaylight/current/bin/client system:start-level

+ run_level='Level 100'

+ '[' 'Level 100' == 'Level 100' ']'

+ '[' 1 -lt 1 ']'

+ echo APPC is not healthy.

+ exit 1
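Decoding the probe trace above: the readiness script counts startODL processes still running, counts Karaf bundles in the Waiting state, and reads the Karaf start level; here the pod is marked unhealthy because startODL is still running (startODL_status=1) even though the run level is already 100 and no bundles are waiting. A hedged sketch of that logic (function and variable names are taken from the trace for illustration, not from the actual APPC script):

```shell
# Illustrative reconstruction of the readiness logic visible in the probe
# trace above; the real check runs inside the APPC container and shells
# out to the Karaf client for bundle:list and system:start-level.
appc_ready() {
  startODL_status="$1"   # count of startODL processes still running
  waiting_bundles="$2"   # count of bundles reported as Waiting
  run_level="$3"         # output of "system:start-level"

  if [ "$run_level" = "Level 100" ] \
      && [ "$startODL_status" -lt 1 ] \
      && [ "$waiting_bundles" -lt 1 ]; then
    echo "APPC is healthy."
    return 0
  fi
  echo "APPC is not healthy."
  return 1
}

# The failing probes above report startODL_status=1, so the check fails
# even though the run level is 100 and no bundles are waiting:
appc_ready 1 0 "Level 100" || echo "readiness probe exits 1"
```

In other words, the pod stays NotReady until the startODL bootstrap process has finished and exited, not merely until Karaf reaches run level 100.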

 

 

=== We can access the karaf

 

$ kubectl exec -it -n onap dev-appc-appc-0 -c appc -- /bin/bash

root@dev-appc-appc-0:/# ls

bin   dev  home  lib64  mnt  proc  run   srv  tmp  var

boot  etc  lib   media  opt  root  sbin  sys  usr

root@dev-appc-appc-0:/# ps -ef | grep java

root       217   156 15 08:48 ?        00:21:44 /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Djava.security.properties=/opt/opendaylight/etc/odl.java.security -Xms128M -Xmx2048m -XX:+UnlockDiagnosticVMOptions -XX:+HeapDumpOnOutOfMemoryError -Dcom.sun.management.jmxremote -Djava.security.egd=file:/dev/./urandom -Djava.endorsed.dirs=/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/endorsed:/usr/lib/jvm/java-8-openjdk-amd64/lib/endorsed:/opt/opendaylight/lib/endorsed -Djava.ext.dirs=/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/ext:/usr/lib/jvm/java-8-openjdk-amd64/lib/ext:/opt/opendaylight/lib/ext -Dkaraf.instances=/opt/opendaylight/instances -Dkaraf.home=/opt/opendaylight -Dkaraf.base=/opt/opendaylight -Dkaraf.data=/opt/opendaylight/data -Dkaraf.etc=/opt/opendaylight/etc -Dkaraf.restart.jvm.supported=true -Djava.io.tmpdir=/opt/opendaylight/data/tmp -Djava.util.logging.config.file=/opt/opendaylight/etc/java.util.logging.properties -Dkaraf.startLocalConsole=false -Dkaraf.startRemoteShell=true -classpath /opt/opendaylight/lib/boot/org.apache.karaf.diagnostic.boot-4.1.5.jar:/opt/opendaylight/lib/boot/org.apache.karaf.jaas.boot-4.1.5.jar:/opt/opendaylight/lib/boot/org.apache.karaf.main-4.1.5.jar:/opt/opendaylight/lib/boot/org.osgi.core-6.0.0.jar org.apache.karaf.main.Main

root     10908  1660  0 10:53 ?        00:00:04 /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Dkaraf.instances=/opt/opendaylight/instances -Dkaraf.home=/opt/opendaylight -Dkaraf.base=/opt/opendaylight -Dkaraf.etc=/opt/opendaylight/etc -Djava.io.tmpdir=/opt/opendaylight/data/tmp -Djava.util.logging.config.file=/opt/opendaylight/etc/java.util.logging.properties -classpath /opt/opendaylight/system/org/apache/karaf/org.apache.karaf.client/4.1.5/org.apache.karaf.client-4.1.5.jar:/opt/opendaylight/system/org/apache/sshd/sshd-core/1.6.0/sshd-core-1.6.0.jar:/opt/opendaylight/system/org/fusesource/jansi/jansi/1.17/jansi-1.17.jar:/opt/opendaylight/system/org/jline/jline/3.6.0/jline-3.6.0.jar:/opt/opendaylight/system/org/slf4j/slf4j-api/1.7.12/slf4j-api-1.7.12.jar org.apache.karaf.client.Main feature:install -r onap-appc-metric

root     25586 25509  0 11:10 ?        00:00:00 /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Dkaraf.instances=/opt/opendaylight/current/instances -Dkaraf.home=/opt/opendaylight/current -Dkaraf.base=/opt/opendaylight/current -Dkaraf.etc=/opt/opendaylight/current/etc -Djava.io.tmpdir=/opt/opendaylight/current/data/tmp -Djava.util.logging.config.file=/opt/opendaylight/current/etc/java.util.logging.properties -classpath /opt/opendaylight/current/system/org/apache/karaf/org.apache.karaf.client/4.1.5/org.apache.karaf.client-4.1.5.jar:/opt/opendaylight/current/system/org/apache/sshd/sshd-core/1.6.0/sshd-core-1.6.0.jar:/opt/opendaylight/current/system/org/fusesource/jansi/jansi/1.17/jansi-1.17.jar:/opt/opendaylight/current/system/org/jline/jline/3.6.0/jline-3.6.0.jar:/opt/opendaylight/current/system/org/slf4j/slf4j-api/1.7.12/slf4j-api-1.7.12.jar org.apache.karaf.client.Main system:start-level

root     25637 25428  0 11:10 ?        00:00:00 grep --color=auto java

 

 

 

 

 

 

 

 


Puccini TOSCA compiler now supports Heat templates (HOT)

Tal Liron
 

The latest release of Puccini, 0.6, features many bug fixes and support for "quirks" (varying behavior according to varying interpretations of gaps in the TOSCA spec), but probably most intriguing is its support for the HOT (Heat Orchestration Template) language in addition to TOSCA 1.2 and TOSCA 1.1.

HOT? What? Why?

First, a quick recap of what Puccini does. It compiles TOSCA into an intermediary form, called Clout, which is a normalized, flattened version of the template described by TOSCA. It is still a "template" at this stage, rather than an instantiated deployment, but it contains JavaScript scriptlets to do the work of actually instantiating the template into a cloud deployment. This work usually means integrating with an orchestrator, such as Ansible or Kubernetes. In Puccini you "bring your own orchestrator" (BYOO). (And, yes, Kubernetes is best understood as an orchestrator in this case, not an infrastructure manager.) If you're just doing a "day 1" installation then you won't even notice this intermediary form. It comes into play in "day 2" topology-changing toolchains, such as scaling and healing, while maintaining the authored rules (requirements-and-capabilities), policies, and boundaries expressed in TOSCA. The scriptlets will be re-run by the toolchain to dynamically re-orchestrate the updated topology.

Among other profiles, Puccini comes with an initial OpenStack profile, which aims to model the resources similarly to HOT, but with the much-enhanced object-oriented, strictly-typed grammar of TOSCA. The "bring your own orchestrator" approach means that you don't need Heat: the embedded scriptlet outputs Ansible playbooks instead to deploy your template. The default end result is the same; however, Ansible has some advantages over Heat, specifically opening the door to fine-grained custom integrations with your orchestration environment, such as OSS/BSS, security audits, external policy frameworks, etc. For all its strengths, Heat is very much tied to OpenStack and cannot so easily interact with other systems.

So, where does HOT fit in?

Puccini can now compile HOT in the same way it compiles TOSCA with the OpenStack profile, with the resulting Clout being indistinguishable if we assume the same resources and topology. This feature was a classic case of "low-hanging fruit", relatively easy to implement due to HOT's grammar being so similar to TOSCA's while being vastly simpler (HOT has no type system and only a handful of entities). There was very little code that needed to be added to Puccini. (It would likewise be relatively easy to add support for the Cloudify DSL if there's interest. The basic compilation engine in Puccini is designed for any TOSCA-like language or dialect.)
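To make the comparison concrete, here is a minimal HOT template of the kind Puccini can now compile; the resource, parameter, and flavor names are illustrative examples, not taken from Puccini's test suite:

```yaml
heat_template_version: 2018-08-31

description: Minimal illustrative HOT template (names are examples only)

parameters:
  image_name:
    type: string
    default: cirros

resources:
  my_server:
    # A single Nova server; HOT has no type system of its own, so
    # OS::Nova::Server is just a well-known resource type name.
    type: OS::Nova::Server
    properties:
      image: { get_param: image_name }
      flavor: m1.small

outputs:
  server_ip:
    value: { get_attr: [ my_server, first_address ] }
```

Compiled by Puccini, a template like this should yield the same kind of Clout as an equivalent TOSCA template written against the OpenStack profile.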

Understanding the "why" might be less immediately obvious. Wouldn't you be using Puccini specifically because it's a TOSCA compiler? Well, possibly, but you might also be choosing it for its orchestration integrations. Using HOT with Puccini gives you the same Ansible playbooks and the same "day 2" opportunities as it would with TOSCA and the OpenStack profile.

Generally speaking, if you're using Puccini then TOSCA is a better choice than HOT for new OpenStack templating projects. But if you are already invested in Heat, this new feature can provide a less jarring migration path towards using Ansible and other orchestrators.

Which orchestrators, exactly? In ONAP's world it could be VF-C, App-C, Multi-Cloud, or existing VIM investments by telcos. But you might already guess which orchestrator is on my mind. In my "Let's Move Everything to Kubernetes" presentation I argued that we could soon be able to use Kubernetes (with KubeVirt, Multus, and other additions) as an orchestrator instead of Heat, allowing us to support the most commonly used OpenStack resources -- Nova servers, Neutron networks, Mistral workflows -- without actually requiring OpenStack. Imagine taking an existing Heat template for a VNF and deploying it in a single command directly to Kubernetes. We're not quite there yet, but the pieces are coming together.


[SO] CSAR related queries

Sunil Kumar
 

Hi SO Team,

 

1. Where could I find the CSAR file extraction code in SO?

 

2. When extracting the CSAR file, do we store the blueprint-name and blueprint-version in the database? If yes, in which table?

 

Please guide me on these queries.


--
Thanks & Regards
Sunil Biradar


R: [onap-discuss] dmaap-message-router NodePort not recheable

Calamita Agostino
 

Hi, after redeploying SDC, healthcheck returns "DMaaP: None" in the SDC Health Check, and distributing a Service returns a POL5000 error.

 

In SDC-FE error log files I see:

 

2019-03-04T09:52:38.847Z        [qtp215145189-44]       INFO    o.o.sdc.fe.servlets.FeProxyServlet      timer=12        ErrorCategory=INFO      RequestId=null ServiceName=SDC catalog serviceInstanceID=null  ErrorCode=0     uuid=599a9bb3-d3c8-4cea-a926-ea6a11762a63       userId=op0001   localAddr=10.42.236.172        remoteAddr=10.42.98.28  SC="500"

 

 

And in SDC-BE error log file I see these messages:

 

2019-03-04T10:15:21.209Z        [qtp215145189-16]       INFO    o.o.sdc.be.filters.BeServletFilter      AuditMessage=ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "clamp" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"       AlertSeverity=0 ElapsedTime=96  EndTimestamp=2019-03-04 10:15:21.208Z  auditOn=false   ServerFQDN=dev-sdc-sdc-be-656bd64b9b-5b89b      StatusCode=ERROR        timer=96        ServiceInstanceId=null  ClassName=org.openecomp.sdc.be.filters.BeServletFilter ResponseDescription=Internal Server Error       ResponseCode=500        InstanceUUID=clamp      RequestId=8c56c7a2-da62-4598-963d-b5ed13673fce PartnerName=Apache-HttpClient/4.5.6 (Java/1.8.0_181)    TargetEntity=registerInDistributionEngine       CustomField1=POST: https://sdc-be.onap:8443/sdc/v1/registerForDistribution     CustomField2=500        AuditBeginTimestamp=2019-03-04 10:15:21.112Z    RemoteHost=10.42.216.84        ErrorCategory=INFO      ServerIPAddress=10.42.13.89     ServiceName=/v1/registerForDistribution ErrorCode=0     POST /sdc/v1/registerForDistribution HTTP/1.1 SC="500"

 

2019-03-04T10:15:24.740Z        [qtp215145189-20]       ERROR   o.o.s.c.config.EcompErrorLogUtil        alarmSeverity=MAJOR     AuditBeginTimestamp=2019-03-04 10:15:24.690Z   AuditMessage=ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "policy" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"      RequestId=ab5939bb-6780-42e7-b63a-54381b74c352  ErrorCategory=ERROR     ServerIPAddress=10.42.13.89     ServiceName=/v1/registerForDistribution        ErrorCode=500   PartnerName=Apache-HttpClient/4.5.5 (Java/1.8.0_171)    auditOn=true    ServerFQDN=dev-sdc-sdc-be-656bd64b9b-5b89b    TargetEntity=registerInDistributionEngine        Error occured in Distribution Engine. Failed operation: registration validation failed

 

 

Any other checks to do?

 

Thanks.

Agos.

 

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 1, 2019 3:39 PM
To: Calamita Agostino <agostino.calamita@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not recheable

 

I’d do a helm delete dev-sdc --purge

Delete /dockerdata-nfs/dev-so

Confirm pv/pvc/pod are gone

Then

 

helm deploy dev-sdc local/onap -f /root/oom/kubernetes/onap/resources/environments/public-cloud.yaml -f /root/integration-override.yaml --namespace onap  --verbose

 

(or whatever your override files are)

 

Looks like SDC came up before dmaap and is confused.

 

There are some less intrusive things to try, but you need SDC to pass Health Check (with DMaaP up from its perspective)

Basic SDC Health Check                                                (DMaaP:UP)| PASS |

 

 

Brian

 

 

From: Calamita Agostino <agostino.calamita@...>
Sent: Friday, March 01, 2019 9:30 AM
To: FREEMAN, BRIAN D <bf1936@...>; onap-discuss@...
Subject: R: [onap-discuss] dmaap-message-router NodePort not recheable

 

It works.

 

curl -X POST http://138.132.168.85:30227/events/TEST_TOPIC -H 'cache-control: no-cache'   -H 'content-type: application/json'  -H 'postman-token: 1c679102-85e8-f1a2-e708-3e6d84f8ea06' -d '{ "test": "success",                "timestamp": "1/1/2020" }'

{

    "serverTimeMs": 1,

    "count": 1

 

curl -X GET 'http://138.132.168.85:30227/events/TEST_TOPIC/g1/c3?timeout=5000' -H 'accept: application/json'  -H 'cache-control: no-cache'  -H 'postman-token: 04778117-fd44-0cac-b70c-ef2a2c3024af'                       

["{\"test\":\"success\",\"timestamp\":\"1/1/2020\"}"]

 

Agos.

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 1, 2019 3:21 PM
To: Calamita Agostino <agostino.calamita@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not recheable

 

Casablanca.

 

OK

 

Use curl or POSTMAN to write to a TEST_TOPIC (unauthenticated topics are created on demand)

(replace 10.12.5.13 with one of your k8s host IPs); you don't need the postman-token, and modify for your environment and preferences, etc.

 

curl -X POST \

  http://10.12.5.13:30227/events/TEST_TOPIC \

  -H 'cache-control: no-cache' \

  -H 'content-type: application/json' \

  -H 'postman-token: 1c679102-85e8-f1a2-e708-3e6d84f8ea06' \

  -d '{ "test": "success",

               "timestamp": "1/1/2020"

}'

 

Then do a GET

 

curl -X GET \

  'http://10.12.5.13:30227/events/TEST_TOPIC/g1/c3?timeout=5000' \

  -H 'accept: application/json' \

  -H 'cache-control: no-cache' \

  -H 'postman-token: 04778117-fd44-0cac-b70c-ef2a2c3024af'

 

 

You should get the test/timestamp object back on the GET (you have to execute the POST/GET twice on the initial topic create)

 

This is to confirm that Message Router is internally talking to itself correctly.
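The POST/GET pair above can be wrapped into a small, repeatable smoke test; this is a sketch, with the host/port defaults taken from the values in this thread (adjust MR_HOST/MR_PORT for your own cluster):

```shell
# Build the events URL for an unauthenticated DMaaP topic.
# Defaults are the example values from this thread.
mr_url() {
  echo "http://${MR_HOST:-10.12.5.13}:${MR_PORT:-30227}/events/$1"
}

# Publish one test event, then try to consume it back.
# Run it twice on a fresh topic: the first POST only creates the topic.
mr_smoke_test() {
  topic="${1:-TEST_TOPIC}"
  curl -s -X POST "$(mr_url "$topic")" \
    -H 'content-type: application/json' \
    -d '{ "test": "success", "timestamp": "1/1/2020" }'
  echo
  curl -s -X GET "$(mr_url "$topic")/g1/c3?timeout=5000" \
    -H 'accept: application/json'
  echo
}
```

Usage: `mr_smoke_test TEST_TOPIC` from any host that can reach the Message Router NodePort; the second run should echo back the test/timestamp object.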

 

 

Brian

 

From: Calamita Agostino <agostino.calamita@...>
Sent: Friday, March 01, 2019 9:13 AM
To: FREEMAN, BRIAN D <bf1936@...>; onap-discuss@...
Subject: R: [onap-discuss] dmaap-message-router NodePort not recheable

 

This is the output:

 

Executing robot tests at log level TRACE

[ ERROR ] Suite 'Testsuites' contains no tests with tag 'healthmr'.

 

Try --help for usage information.

command terminated with exit code 252

 

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 1, 2019 3:12 PM
To: Calamita Agostino <agostino.calamita@...>; onap-discuss@...
Subject: RE: [onap-discuss] dmaap-message-router NodePort not recheable

 

Please try ./ete-k8s.sh onap healthmr

 

From: Calamita Agostino <agostino.calamita@...>
Sent: Friday, March 01, 2019 9:09 AM
To: FREEMAN, BRIAN D <bf1936@...>; onap-discuss@...
Subject: R: [onap-discuss] dmaap-message-router NodePort not recheable

 

I didn’t find the healthmr test, only health.

(

./ete-k8s.sh onap

Usage: ete-k8s.sh [namespace] [ health | healthdist | distribute | instantiate | instantiateVFWCL | instantiateDemoVFWCL |  | portal ] )

 

The command ./ete-k8s.sh onap health reports the list below (Basic DMAAP Message Router Health Check = PASS).

In my environment there are some PODs not in Running state:

 

dev-aai-aai-data-router-5d55646cdc-cc62v                      1/2       CrashLoopBackOff   1084       4d        10.42.79.203    onapkm3   <none>

dev-appc-appc-ansible-server-76fcf9454d-8km9d                 0/1       CrashLoopBackOff   1656       6d        10.42.212.202   onapkm0   <none>

dev-oof-oof-has-api-585497f5-ktjsv                            0/1       Init:0/3           1085       8d        10.42.86.82     onapkm0   <none>

dev-oof-oof-has-controller-9469b9ff8-td4k9                    0/1       Init:1/3           945        8d        10.42.5.110     onapkm2   <none>

dev-oof-oof-has-data-d559897dc-4lmkt                          0/1       Init:1/4           1091       8d        10.42.199.220   onapkm3   <none>

dev-oof-oof-has-healthcheck-jq9xq                             0/1       Init:0/1           1092       8d        10.42.242.145   onapkm3   <none>

dev-oof-oof-has-reservation-868c7c88ff-pv79n                  0/1       Init:1/4           1081       8d        10.42.176.61    onapkm1   <none>

dev-oof-oof-has-solver-6f8bc6fdf4-tw4cj                       0/1       Init:1/4           1084       8d        10.42.29.154    onapkm0   <none>

dev-sdnc-sdnc-ansible-server-7c76f965c6-hqtzl                 0/1       CrashLoopBackOff   1844       8d        10.42.202.36    onapkm3   <none>

dev-sdnc-sdnc-ueb-listener-6d74459c6-tdqhc                    0/1       CrashLoopBackOff   542        1d        10.42.219.51    onapkm2   <none>

 

and multicloud is not deployed.

 

==============================================================================

Testsuites

==============================================================================

Testsuites.Health-Check :: Testing ecomp components are available via calls.

==============================================================================

Basic A&AI Health Check                                               | PASS |

------------------------------------------------------------------------------

Basic AAF Health Check                                                | PASS |

------------------------------------------------------------------------------

Basic AAF SMS Health Check                                            | PASS |

------------------------------------------------------------------------------

Basic APPC Health Check                                               | PASS |

------------------------------------------------------------------------------

Basic CLI Health Check                                                | PASS |

------------------------------------------------------------------------------

Basic CLAMP Health Check                                              | PASS |

------------------------------------------------------------------------------

Basic DCAE Health Check                                               [ WARN ] Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e8102250>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /healthcheck

[ WARN ] Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e817a8d0>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /healthcheck

[ WARN ] Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e817a850>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /healthcheck

| FAIL |

ConnectionError: HTTPConnectionPool(host='dcae-healthcheck.onap', port=80): Max retries exceeded with url: /healthcheck (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e817a5d0>: Failed to establish a new connection: [Errno -2] Name or service not known',))

------------------------------------------------------------------------------

Basic DMAAP Data Router Health Check                                  | PASS |

------------------------------------------------------------------------------

Basic DMAAP Message Router Health Check                               | PASS |

------------------------------------------------------------------------------

Basic External API NBI Health Check                                   [ WARN ] Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e80d6c50>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /nbi/api/v3/status

[ WARN ] Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e80c4e10>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /nbi/api/v3/status

[ WARN ] Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e8199f50>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /nbi/api/v3/status

| FAIL |

ConnectionError: HTTPConnectionPool(host='nbi.onap', port=8080): Max retries exceeded with url: /nbi/api/v3/status (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f03e80e5410>: Failed to establish a new connection: [Errno -2] Name or service not known',))

------------------------------------------------------------------------------

Basic Log Elasticsearch Health Check                                  | PASS |

------------------------------------------------------------------------------

Basic Log Kibana Health Check                                         | PASS |

------------------------------------------------------------------------------

Basic Log Logstash Health Check                                       | PASS |

------------------------------------------------------------------------------

Basic Microservice Bus Health Check                                   | PASS |

------------------------------------------------------------------------------

Basic Multicloud API Health Check                                     | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-ocata API Health Check                               | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-pike API Health Check                                | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-titanium_cloud API Health Check                      | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Multicloud-vio API Health Check                                 | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic OOF-Homing Health Check                                         | FAIL |

Test timeout 10 seconds exceeded.

------------------------------------------------------------------------------

Basic OOF-SNIRO Health Check                                          | PASS |

------------------------------------------------------------------------------

Basic OOF-CMSO Health Check                                           | PASS |

------------------------------------------------------------------------------

Basic Policy Health Check                                             | PASS |

------------------------------------------------------------------------------

Basic Pomba AAI-context-builder Health Check                          | PASS |

------------------------------------------------------------------------------

Basic Pomba SDC-context-builder Health Check                          | PASS |

------------------------------------------------------------------------------

Basic Pomba Network-discovery-context-builder Health Check            | PASS |

------------------------------------------------------------------------------

Basic Portal Health Check                                             | PASS |

------------------------------------------------------------------------------

Basic SDC Health Check                                                (DMaaP:None)| PASS |

------------------------------------------------------------------------------

Basic SDNC Health Check                                               | PASS |

------------------------------------------------------------------------------

Basic SO Health Check                                                 | PASS |

------------------------------------------------------------------------------

Basic UseCaseUI API Health Check                                      | PASS |

------------------------------------------------------------------------------

Basic VFC catalog API Health Check                                    | PASS |

------------------------------------------------------------------------------

Basic VFC emsdriver API Health Check                                  | PASS |

------------------------------------------------------------------------------

Basic VFC gvnfmdriver API Health Check                                | PASS |

------------------------------------------------------------------------------

Basic VFC huaweivnfmdriver API Health Check                           | PASS |

------------------------------------------------------------------------------

Basic VFC jujuvnfmdriver API Health Check                             | PASS |

------------------------------------------------------------------------------

Basic VFC multivimproxy API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VFC nokiavnfmdriver API Health Check                            | PASS |

------------------------------------------------------------------------------

Basic VFC nokiav2driver API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VFC nslcm API Health Check                                      | PASS |

------------------------------------------------------------------------------

Basic VFC resmgr API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC vnflcm API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC vnfmgr API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC vnfres API Health Check                                     | PASS |

------------------------------------------------------------------------------

Basic VFC workflow API Health Check                                   | PASS |

------------------------------------------------------------------------------

Basic VFC ztesdncdriver API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VFC ztevnfmdriver API Health Check                              | PASS |

------------------------------------------------------------------------------

Basic VID Health Check                                                | PASS |

------------------------------------------------------------------------------

Basic VNFSDK Health Check                                             | PASS |

------------------------------------------------------------------------------

Basic Holmes Rule Management API Health Check                         | FAIL |

502 != 200

------------------------------------------------------------------------------

Basic Holmes Engine Management API Health Check                       | FAIL |

502 != 200

------------------------------------------------------------------------------

Testsuites.Health-Check :: Testing ecomp components are available ... | FAIL |

51 critical tests, 41 passed, 10 failed

51 tests total, 41 passed, 10 failed

==============================================================================

Testsuites                                                            | FAIL |

51 critical tests, 41 passed, 10 failed

51 tests total, 41 passed, 10 failed

==============================================================================

Output:  /share/logs/0001_ete_health/output.xml

Log:     /share/logs/0001_ete_health/log.html

Report:  /share/logs/0001_ete_health/report.html

command terminated with exit code 10

 

 

 

From: FREEMAN, BRIAN D [mailto:bf1936@...]
Sent: Friday, March 1, 2019 2:49 PM
To: onap-discuss@...; Calamita Agostino <agostino.calamita@...>
Subject: RE: [onap-discuss] dmaap-message-router NodePort not recheable

 

Try a POST to make sure you can write to message router.

I doubt it's connectivity.

 

If you are on the master branch, try ./ete-k8s.sh onap healthmr to test a write/read to a test topic.

 

(do it twice, since the first time it creates a test topic and Kafka doesn't forward the message till both the publisher and the subscriber have connected)

 

Brian

 

 

From: onap-discuss@... <onap-discuss@...> On Behalf Of Calamita Agostino
Sent: Friday, March 01, 2019 4:30 AM
To: onap-discuss@...
Subject: R: [onap-discuss] dmaap-message-router NodePort not recheable

 

I tried to execute a wget command from the sdc-be pod to the message-router REST API, and I see that dmaap-message-router is reachable from sdc-be.

 

This is the result:

 

# kubectl exec -it  dev-sdc-sdc-be-656bd64b9b-jh57x  -n onap -- /bin/bash

 

bash-4.4# wget "http://message-router:3904/topics"

Connecting to message-router:3904 (10.43.1.20:3904)

topics               100% |*******************************|   131   0:00:00 ETA

bash-4.4# cat topics

{"topics": [

    "__consumer_offsets",

    "champRawEvents",

    "SDC-DISTR-NOTIF-TOPIC-AUTO",

    "org.onap.dmaap.mr.PNF_READY"

]}bash-4.4#

 

But the audit.log of sdc-be, after a “Distribution Service” action from the Portal, says:

 

2019-03-01T08:32:07.986Z        [qtp215145189-323354]   INFO    o.o.sdc.be.filters.BeServletFilter     

ResponseCode=500        InstanceUUID=null       RequestId=d2f65e19-b07b-4266-8be2-f170aba42fb1  AlertSeverity=0 ElapsedTime=3  

EndTimestamp=2019-03-01 08:32:07.986Z   PartnerName=op0001      auditOn=true    ServerFQDN=dev-sdc-sdc-be-656bd64b9b-jh57x      

StatusCode=ERROR        TargetEntity=Distribution Engine is DOWN       

CustomField1=POST: http://sdc-be.onap:8080/sdc2/rest/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate       

timer=3 CustomField2=500        AuditBeginTimestamp=2019-03-01 08:32:07.983Z    RemoteHost=10.42.194.84 ErrorCategory=ERROR    

ServerIPAddress=10.42.179.134   ServiceName=/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate      

ServiceInstanceId=null   ClassName=org.openecomp.sdc.be.filters.BeServletFilter  ResponseDescription=Internal Server Error      

ErrorCode=500   null

 

 

In the same log file I found a lot of messages like this one:

 

2019-03-01T09:21:31.850Z        [qtp215145189-399996]   INFO    o.o.sdc.be.filters.BeServletFilter      AuditMessage=ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "aai" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"  ResponseCode=500        InstanceUUID=aai-ml     RequestId=7f01a5b2-ee38-42c9-b7a4-330f020a4134 AlertSeverity=0  ElapsedTime=169 EndTimestamp=2019-03-01 09:21:31.850Z   PartnerName=Apache-HttpClient/4.5.6 (Java/1.8.0_171)    auditOn=true    ServerFQDN=dev-sdc-sdc-be-656bd64b9b-jh57x      StatusCode=ERROR        TargetEntity=registerInDistributionEngine       CustomField1=POST: https://sdc-be.onap:8443/sdc/v1/registerForDistribution      timer=169       CustomField2=500        AuditBeginTimestamp=2019-03-01 09:21:31.681Z    RemoteHost=10.42.209.109        ErrorCategory=ERROR     ServerIPAddress=10.42.179.134   ServiceName=/v1/registerForDistribution ServiceInstanceId=null  ClassName=org.openecomp.sdc.be.filters.BeServletFilter  ResponseDescription=Internal Server Error       ErrorCode=500   ACTION = "HttpAuthentication" URL = "v1/registerForDistribution" USER = "aai" AUTH_STATUS = "AUTH_SUCCESS" REALM = "ASDC"

 

Thanks.

 

From: onap-discuss@... [mailto:onap-discuss@...] On behalf of Calamita Agostino
Sent: Thursday, 28 February 2019 16:13
To: onap-discuss@...
Subject: [onap-discuss] dmaap-message-router NodePort not reachable

 

Hi all,

I have an issue related to connectivity between the sdc-be pod and dmaap-message-router.

My installation is Casablanca 3.0.0 on a 7-node Kubernetes VM cluster.

 

All dmaap pods are up and running:

 

dev-dmaap-dbc-pg-0                                            1/1       Running            0          1d        10.42.173.158   onapkm5   <none>

dev-dmaap-dbc-pg-1                                            1/1       Running            0          1d        10.42.188.140   onapkm2   <none>

dev-dmaap-dbc-pgpool-7b748d5894-mr2m9                         1/1       Running            0          1d        10.42.237.193   onapkm3   <none>

dev-dmaap-dbc-pgpool-7b748d5894-n6dks                         1/1       Running            0          1d        10.42.192.244   onapkm2   <none>

dev-dmaap-dmaap-bus-controller-6757c4c86-8rq5p                1/1       Running            0          1d        10.42.185.132   onapkm1   <none>

dev-dmaap-dmaap-dr-db-bb4c67cfd-tm7td                         1/1       Running            0          1d        10.42.152.59    onapkm1   <none>

dev-dmaap-dmaap-dr-node-66c8749959-tpdtf                      1/1       Running            0          1d        10.42.216.13    onapkm2   <none>

dev-dmaap-dmaap-dr-prov-5c766b8d69-qzqn2                      1/1       Running            0          1d        10.42.115.247   onapkm6   <none>

dev-dmaap-message-router-fb9f4bc7d-5z52j                      1/1       Running            0          6h        10.42.138.31    onapkm3   <none>

dev-dmaap-message-router-kafka-5fbc897f48-4bpb6               1/1       Running            0          1d        10.42.78.141    onapkm4   <none>

dev-dmaap-message-router-zookeeper-557954854-8d6p9            1/1       Running            0          1d        10.42.169.205   onapkm1   <none>

 

but when I try to distribute a service from the SDC Portal, I get “Internal Server Error”.

 

SDC-BE log file traces:

 

2019-02-28T08:50:35.318Z        [qtp215145189-159837]   INFO    o.o.sdc.be.filters.BeServletFilter      ResponseCode=500        InstanceUUID=null RequestId=dab0fd50-b06e-4a65-b4a8-7d7edeae3e01   AlertSeverity=0 ElapsedTime=99  EndTimestamp=2019-02-28 08:50:35.318Z PartnerName=op0001      auditOn=true    ServerFQDN=dev-sdc-sdc-be-656bd64b9b-jh57x      StatusCode=ERROR        TargetEntity=Distribution Engine is DOWN        CustomField1=POST: http://sdc-be.onap:8080/sdc2/rest/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate        timer=99        CustomField2=500   AuditBeginTimestamp=2019-02-28 08:50:35.219Z    RemoteHost=10.42.194.84 ErrorCategory=ERROR     ServerIPAddress=10.42.179.134   ServiceName=/v1/catalog/services/02e0c5a4-be65-4d09-9f1e-49a2dab0f865/distribution/PROD/activate        ServiceInstanceId=null  ClassName=org.openecomp.sdc.be.filters.BeServletFilter  ResponseDescription=Internal Server Error       ErrorCode=500   null

 

The SDC healthcheck also reports that the U-EB Cluster is DOWN.

 

Inside the SDC-BE pod, I ran a traceroute to “message-router-zookeeper” and to “message-router”.

 

This is the result (the first is OK, the second one is NOT OK):

 

bash-4.4# traceroute  message-router-zookeeper

traceroute to message-router-zookeeper (10.42.169.205), 30 hops max, 46 byte packets

1  10.42.7.46 (10.42.7.46)  0.213 ms  0.005 ms  0.005 ms

2  10.42.190.179 (10.42.190.179)  0.194 ms  0.145 ms  0.135 ms

3  10.42.169.205 (10.42.169.205)  0.461 ms  0.160 ms  0.134 ms

 

bash-4.4# traceroute  message-router

traceroute to message-router (10.43.1.20), 30 hops max, 46 byte packets

1  10.42.0.1 (10.42.0.1)  0.009 ms  0.005 ms  0.005 ms

2  itpat1ng505.palermo.italtel.it (138.132.168.173)  0.344 ms  2.211 ms  1.910 ms     <-- 138.132.168.X is the VM public network

3  138.132.169.2 (138.132.169.2)  5.063 ms  3.859 ms  3.934 ms

4  *  *  *

5  *  *  *

6  *  *  *

 

traceroute to message-router-kafka (10.43.148.154), 30 hops max, 46 byte packets

1  10.42.0.1 (10.42.0.1)  0.006 ms  0.005 ms  0.004 ms

2  itpat1ng505.palermo.italtel.it (138.132.168.173)  0.391 ms  0.337 ms  0.314 ms

3  138.132.169.2 (138.132.169.2)  0.803 ms  0.748 ms  0.807 ms

4  *  *  *

5  *  *  *

6  *  *  *

 

It seems that I cannot reach NodePort or ClusterIP addresses from inside a pod. This is the routing table inside the pod:

 

bash-4.4# netstat -rn

Kernel IP routing table

Destination     Gateway         Genmask         Flags   MSS Window  irtt Iface

0.0.0.0         10.42.0.1       0.0.0.0         UG        0 0          0 eth0

10.42.0.0       0.0.0.0         255.255.0.0     U         0 0          0 eth0
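The routing table above already hints at the cause: the pod only has an interface route for 10.42.0.0/16 (the pod overlay network), so ClusterIP addresses in 10.43.0.0/16 fall through to the default route and leave the node, which matches the failing traceroutes. A small sketch of that longest-prefix-match decision (CIDRs taken from the outputs above; this is just an illustration of the kernel's route selection, not a Kubernetes API):

```python
import ipaddress

# Routes from the pod's routing table: (destination network, where the packet goes)
routes = [
    (ipaddress.ip_network("10.42.0.0/16"), "eth0 (pod overlay network)"),
    (ipaddress.ip_network("0.0.0.0/0"), "default gateway 10.42.0.1 (leaves the cluster)"),
]

def next_hop(dst: str) -> str:
    """Pick the most specific matching route, as the kernel does."""
    ip = ipaddress.ip_address(dst)
    matches = [(net, via) for net, via in routes if ip in net]
    # Longest prefix wins
    net, via = max(matches, key=lambda r: r[0].prefixlen)
    return via

print(next_hop("10.42.169.205"))  # message-router-zookeeper pod IP -> stays on the overlay
print(next_hop("10.43.1.20"))     # message-router ClusterIP -> default gateway, escapes the node
```

Note that ClusterIPs are virtual: normally kube-proxy rewrites them (via iptables or ipvs rules on the node) before routing ever happens, so a traceroute that shows a 10.43.x.x address escaping via the default gateway suggests the kube-proxy rules for the service CIDR are missing or broken on that node.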

 

What can I check on the Kubernetes cluster?

 

Thanks.

Agostino.

 

Internet Email Confidentiality Footer ** La presente comunicazione, con le informazioni in essa contenute e ogni documento o file allegato, e' rivolta unicamente alla/e persona/e cui e' indirizzata ed alle altre da questa autorizzata/e a riceverla. Se non siete i destinatari/autorizzati siete avvisati che qualsiasi azione, copia, comunicazione, divulgazione o simili basate sul contenuto di tali informazioni e' vietata e potrebbe essere contro la legge vigente (ad es. art. 616 C.P., D.Lgs n. 196/2003 Codice Privacy, Regolamento Europeo n. 679/2016/GDPR). Se avete ricevuto questa comunicazione per errore, vi preghiamo di darne immediata notizia al mittente e di distruggere il messaggio originale e ogni file allegato senza farne copia alcuna o riprodurne in alcun modo il contenuto. Al link seguente e' disponibile l'informativa Privacy: http://www.italtel.com/it/about/privacy/ ** This e-mail and its attachments are intended for the addressee(s) only and are confidential and/or may contain legally privileged information. If you have received this message by mistake or are not one of the addressees above, you may take no action based on it, and you may not copy or show it to anyone; please reply to this e-mail and point out the error which has occurred. Click here to read your privacy notice: http://www.italtel.com/it/about/privacy/
