FSM Upgrade Steps

1. FSM Pre-Upgrade Steps

  • Add a kafka-v2 credential named kafka-platform using the platform Kafka details from the network configuration file on the CLI VM:

    /opt/rdaf/config/network_config/config.json
    

a) bootstrap.servers

b) Use <tenant-id>.user as the SASL username and <tenant-id>.secret as the SASL password

c) Copy the SSL CA certificate (PEM) from the certificate file

d) Verify the credential and confirm its status is OK (a connectivity check sketch follows below)
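
Optionally, you can confirm from the CLI VM that the supplied Kafka details work before wiring up the pipelines. The following is a minimal sketch using the confluent-kafka Python client; the SASL mechanism (PLAIN), the placeholder values, and the CA file path are assumptions to be replaced with the values taken from config.json.

    # connectivity_check.py -- hedged sketch, not part of the product
    from confluent_kafka.admin import AdminClient

    TENANT_ID = "<tenant-id>"   # replace with the actual tenant ID

    conf = {
        "bootstrap.servers": "<bootstrap.servers from config.json>",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",               # assumption: adjust if the platform uses SCRAM
        "sasl.username": f"{TENANT_ID}.user",     # SASL username, per step (b)
        "sasl.password": "<value of <tenant-id>.secret>",
        "ssl.ca.location": "/path/to/platform-kafka-ca.pem",   # CA PEM from step (c)
    }

    # list_topics() fails quickly if the broker address, credentials, or CA cert are wrong
    md = AdminClient(conf).list_topics(timeout=10)
    print("Brokers reachable:", len(md.brokers), "| topics visible:", len(md.topics))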

  • Create and publish the pipeline fsm-ha-migration (update the tenant ID in the pipeline). It reads messages from the old Kafka topic and publishes them to the new topic; a sketch of this consume-and-republish pattern follows the Service BP below.
--> @c:flex-block  
    --> @dm:empty    
    --> @exec:define-function name="publish_to_kafka"
        --> @kafka-platform:write-stream name="<tenant-id>.datanetwork.fsm-events" and partition_key="{{row.partitionKey}}"
    --> @exec:end-function  

--> @c:new-block
    --> @kafka-platform:read-stream name="<tenant-id>.fsm-events" and group="grp-fsmevents-<tenant-id>" and offset_reset="earliest" and batch_size=1
        --> @exec:call-function name="publish_to_kafka"

Create Service BP for the above pipeline

name: fsm-ha-migration
id: fsm-ha-migration
version: '2023_11_16_01'
category: ITSM
comment: fsm-ha-migration
enabled: false
auto_deploy: true
type: Service
provider: CloudFabrix Software, Inc.
service_pipelines:
    -   name: fsm-ha-migration
        label: fsm-ha-migration
        version: '*'
        site: .*
        site_type: regex
        instances: 1
        scaling_policy:
            min_instances: 1
            max_instances: 1
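
Conceptually, both migration pipelines (this one and fsm-internal-events-migration below) do the same thing: consume each message from the old topic and republish it to the new topic, preserving the partition key. The Python sketch below, using the confluent-kafka client, illustrates that pattern outside RDA; the connection settings are placeholders and this is not how the product runs the pipeline.

    from confluent_kafka import Consumer, Producer

    TENANT_ID = "<tenant-id>"
    OLD_TOPIC = f"{TENANT_ID}.fsm-events"                # source (old) topic
    NEW_TOPIC = f"{TENANT_ID}.datanetwork.fsm-events"    # destination (new) topic

    common = {"bootstrap.servers": "<bootstrap.servers>"}   # plus the same SASL/SSL settings as kafka-platform

    consumer = Consumer({**common,
                         "group.id": f"grp-fsmevents-{TENANT_ID}",
                         "auto.offset.reset": "earliest"})   # offset_reset="earliest" in the pipeline
    producer = Producer(common)

    consumer.subscribe([OLD_TOPIC])
    try:
        while True:
            msg = consumer.poll(1.0)          # batch_size=1: one message at a time
            if msg is None or msg.error():
                continue
            # republish with the same key so partitioning is preserved (partition_key in the pipeline)
            producer.produce(NEW_TOPIC, key=msg.key(), value=msg.value())
            producer.flush()
    finally:
        consumer.close()
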
  • Create and publish the pipeline fsm-internal-events-migration (update the tenant ID in the pipeline), then create its Service BP.
--> @c:flex-block  
    --> @dm:empty    
    --> @exec:define-function name="publish_to_kafka"
        --> @kafka-platform:write-stream name="<tenant-id>.datanetwork.fsm-oia-external-ticket-inputs" and partition_key="{{row.partitionKey}}"
    --> @exec:end-function  

--> @c:new-block
    --> @kafka-platform:read-stream name="<tenant-id>.fsm-oia-external-ticket-inputs" and group="fsm-oia-external-ticket-inputs-create-ticket-group" and offset_reset="earliest" and batch_size=1
        --> @exec:call-function name="publish_to_kafka"

Create Service BP for the above pipeline

name: fsm-internal-events-migration
id: fsm-internal-events-migration
version: '2023_11_16_02'
category: ITSM
comment: fsm-internal-events-migration
enabled: false
auto_deploy: true
type: Service
provider: CloudFabrix Software, Inc.
service_pipelines:
    -   name: fsm-internal-events-migration
        label: fsm-internal-events-migration
        version: '*'
        site: .*
        site_type: regex
        instances: 1
        scaling_policy:
            min_instances: 1
            max_instances: 1
  • Truncate and drop the timer table.

    Log into the database, connect to the <tenantId>_fsm schema, and execute the commands below.

    truncate table timer;

    drop table timer;
    

  • Disable the Create Ticket, Update Ticket, and Resolve Ticket Blueprints under Service Blueprints

  • Enable fsm-ha-migration blueprint

  • Enable fsm-internal-events-migration blueprint

2. FSM Post-Upgrade Steps

  • Deploy Bundles

    a) oia_fsm_common_ticketing_bundle

    b) oia_fsm_aots_ticketing_bundle

  • Wait a few minutes for the fsm-ha-migration and fsm-internal-events-migration pipelines to consume all the messages

  • Ensure that all events have been read from the old topic and that there is no consumer lag, then delete the old Kafka topic (a verification sketch follows below)
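
One way to verify there is no lag before deleting the old topic is to compare the committed offsets of the migration consumer group against the topic's high watermarks. The sketch below uses the confluent-kafka Python client with the fsm-ha-migration topic and group names from the pipeline above (use the fsm-oia-external-ticket-inputs topic and group for the second pipeline); the broker and SASL/SSL settings are placeholders.

    from confluent_kafka import Consumer, TopicPartition
    from confluent_kafka.admin import AdminClient

    TENANT_ID = "<tenant-id>"
    OLD_TOPIC = f"{TENANT_ID}.fsm-events"
    GROUP_ID = f"grp-fsmevents-{TENANT_ID}"

    broker_conf = {"bootstrap.servers": "<bootstrap.servers>"}   # plus the same SASL/SSL settings as kafka-platform

    consumer = Consumer({**broker_conf, "group.id": GROUP_ID, "enable.auto.commit": False})
    meta = consumer.list_topics(OLD_TOPIC, timeout=10)
    partitions = [TopicPartition(OLD_TOPIC, p) for p in meta.topics[OLD_TOPIC].partitions]

    total_lag = 0
    for tp in consumer.committed(partitions, timeout=10):
        low, high = consumer.get_watermark_offsets(TopicPartition(OLD_TOPIC, tp.partition), timeout=10)
        position = tp.offset if tp.offset >= 0 else low   # no commit yet -> assume low watermark
        total_lag += max(high - position, 0)
    consumer.close()
    print("Total lag on", OLD_TOPIC, "=", total_lag)

    if total_lag == 0:
        # delete the old topic only once the lag has reached zero
        admin = AdminClient(broker_conf)
        admin.delete_topics([OLD_TOPIC])[OLD_TOPIC].result()   # raises if the deletion fails
        print("Deleted", OLD_TOPIC)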

  • Disable the fsm-ha-migration Blueprint, then delete the BP and its pipeline.

  • Disable the fsm-internal-events-migration Blueprint, then delete the BP and its pipeline.

  • Enable the Create Ticket, Update Ticket, and Resolve Ticket Blueprints

  • Update the retention days for the following pstreams:

    - fsm-debug-outbound-ticketing - 15 days

    - aots_ticket_notifications - 180 days

    - rda_fsm_event_stream - 180 days

    - fsm-oia-external-ticket-inputs - 180 days

    - fsm-transitions - 180 days

    - aots_ticketing_status - 180 days

    - fsm-fired-events - 180 days