You can use the Developer perspective of the OpenShift Container Platform web console to create CI/CD pipelines for your software delivery process.
In the Developer perspective:
Use the Add → Pipeline → Pipeline builder option to create customized pipelines for your application.
Use the Add → From Git option to create pipelines using operator-installed pipeline templates and resources while creating an application on OpenShift Container Platform.
After you create the pipelines for your application, you can view and visually interact with the deployed pipelines in the Pipelines view. You can also use the Topology view to interact with the pipelines created using the From Git option. You must apply custom labels to pipelines created using the Pipeline builder to see them in the Topology view.
You have access to an OpenShift Container Platform cluster and have switched to the Developer perspective.
You have the OpenShift Pipelines Operator installed in your cluster.
You are a cluster administrator or a user with create and edit permissions.
You have created a project.
In the Developer perspective of the console, you can use the +Add → Pipeline → Pipeline builder option to:
Configure pipelines using either the Pipeline builder or the YAML view.
Construct a pipeline flow using existing tasks and cluster tasks. When you install the OpenShift Pipelines Operator, it adds reusable pipeline cluster tasks to your cluster.
Specify the type of resources required for the pipeline run, and if required, add additional parameters to the pipeline.
Reference these pipeline resources in each of the tasks in the pipeline as input and output resources.
If required, reference any additional parameters added to the pipeline in the task. The parameters for a task are prepopulated based on the specifications of the task.
Use the Operator-installed, reusable snippets and samples to create detailed pipelines.
In the +Add view of the Developer perspective, click the Pipeline tile to see the Pipeline builder page.
Configure the pipeline using either the Pipeline builder view or the YAML view.
The Pipeline builder view supports a limited number of fields whereas the YAML view supports all available fields. Optionally, you can also use the Operator-installed, reusable snippets and samples to create detailed pipelines.
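If you prefer the YAML view, a minimal pipeline definition looks similar to the following sketch. The pipeline name, parameter, and workspace name are placeholders, and the sketch assumes that the git-clone task from Tekton Hub is installed in your cluster.
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: example-pipeline             # placeholder name
spec:
  params:
    - name: GIT_URL                  # placeholder parameter
      type: string
  workspaces:
    - name: shared-workspace         # placeholder workspace
  tasks:
    - name: fetch-repository
      taskRef:
        name: git-clone              # assumes the git-clone task is available
      params:
        - name: url
          value: $(params.GIT_URL)
      workspaces:
        - name: output               # workspace name defined by the git-clone task
          workspace: shared-workspace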
Configure your pipeline by using Pipeline builder:
In the Name field, enter a unique name for the pipeline.
In the Tasks section:
Click Add task.
Search for a task using the quick search field and select the required task from the displayed list.
Click Add or Install and add. In this example, use the s2i-nodejs task.
The search list contains all the Tekton Hub tasks and the tasks available in the cluster. If a task is already installed, the Add option is displayed; otherwise, the Install and add option is displayed to install and add the task. The Update and add option is displayed when you add the same task with an updated version.
To add sequential tasks to the pipeline:
Click the plus icon to the right or left of the task → click Add task.
Search for a task using the quick search field and select the required task from the displayed list.
Click Add or Install and add.
To add a finally task:
Click Add finally task → click Add task.
Search for a task using the quick search field and select the required task from the displayed list.
Click Add or Install and add.
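In the pipeline YAML, a sequential task uses runAfter to wait for a previous task, and finally tasks appear under the finally section. The following sketch assumes the s2i-nodejs and openshift-client cluster tasks used in this example, and a hypothetical cleanup task for the finally section.
spec:
  tasks:
    - name: build
      taskRef:
        name: s2i-nodejs
        kind: ClusterTask
    - name: deploy
      runAfter:
        - build                      # runs deploy only after build completes
      taskRef:
        name: openshift-client
        kind: ClusterTask
  finally:
    - name: cleanup                  # hypothetical finally task
      taskRef:
        name: cleanup-task           # hypothetical task name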
In the Resources section, click Add Resources to specify the name and type of resources for the pipeline run. These resources are then used by the tasks in the pipeline as inputs and outputs. For this example:
Add an input resource. In the Name field, enter Source, and then from the Resource Type drop-down list, select Git.
Add an output resource. In the Name field, enter Img, and then from the Resource Type drop-down list, select Image.
A red icon appears next to the task if a resource is missing.
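The input and output resources that you add in the Resources section appear in the pipeline YAML under spec.resources, similar to the following sketch:
spec:
  resources:
    - name: Source                   # input resource of type git
      type: git
    - name: Img                      # output resource of type image
      type: image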
Optional: The Parameters for a task are pre-populated based on the specifications of the task. If required, use the Add Parameters link in the Parameters section to add additional parameters.
In the Workspaces section, click Add workspace and enter a unique workspace name in the Name field. You can add multiple workspaces to the pipeline.
In the Tasks section, click the s2i-nodejs task to see the side panel with details for the task. In the task side panel, specify the resources and parameters for the s2i-nodejs task:
If required, in the Parameters section, add more parameters to the default ones, by using the $(params.<param-name>) syntax.
In the Image section, enter Img, as specified in the Resources section.
In the Workspaces section, select a workspace from the source drop-down list.
Add resources, parameters, and workspaces to the openshift-client task.
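In the YAML view, the settings that you specify in the task side panel correspond to entries similar to the following sketch. The parameter name and the resource and workspace names inside the task are assumptions based on a typical s2i task definition; check the task specification for the exact names.
  tasks:
    - name: s2i-nodejs
      taskRef:
        name: s2i-nodejs
        kind: ClusterTask
      params:
        - name: PATH_CONTEXT              # assumed parameter name
          value: $(params.PATH_CONTEXT)
      resources:
        inputs:
          - name: source                  # assumed input name in the task
            resource: Source
        outputs:
          - name: image                   # assumed output name in the task
            resource: Img
      workspaces:
        - name: source                    # assumed workspace name in the task
          workspace: my-workspace         # placeholder pipeline workspace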
Click Create to create and view the pipeline on the Pipeline details page.
Click the Actions drop-down menu and then click Start to see the Start Pipeline page.
The Workspaces section lists the workspaces you created earlier. Use the respective drop-down list to specify the volume source for your workspace. You have the following options: Empty Directory, Config Map, Secret, PersistentVolumeClaim, or VolumeClaimTemplate.
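In the resulting pipeline run, each volume source that you select maps to an entry under spec.workspaces, similar to the following sketch. The workspace, claim, and config map names are placeholders.
apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
  generateName: example-pipeline-run-
spec:
  pipelineRef:
    name: example-pipeline
  workspaces:
    - name: shared-workspace           # PersistentVolumeClaim option
      persistentVolumeClaim:
        claimName: example-pvc
    - name: scratch                    # Empty Directory option
      emptyDir: {}
    - name: settings                   # Config Map option
      configMap:
        name: example-config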
To create pipelines along with applications, use the From Git option in the Add view of the Developer perspective. For more information, see Creating applications using the Developer perspective.
The Pipelines view in the Developer perspective lists all the pipelines in a project, along with the following details:
The namespace in which the pipeline was created
The last pipeline run
The status of the tasks in the pipeline run
The status of the pipeline run
The creation time of the last pipeline run
In the Pipelines view of the Developer perspective, select a project from the Project drop-down list to see the pipelines in that project.
Click the required pipeline to see the Pipeline details page.
By default, the Details tab displays a visual representation of all the serial tasks, parallel tasks, finally tasks, and when expressions in the pipeline. The tasks and the finally tasks are listed in the lower right portion of the page. Click the listed Tasks and Finally tasks to view the task details.
Optional: On the Pipeline details page, click the Metrics tab to see the following information about pipelines:
Pipeline Success Ratio
Number of Pipeline Runs
Pipeline Run Duration
Task Run Duration
You can use this information to improve the pipeline workflow and eliminate issues early in the pipeline lifecycle.
Optional: Click the YAML tab to edit the YAML file for the pipeline.
Optional: Click the Pipeline Runs tab to see the completed, running, or failed runs for the pipeline.
The Pipeline Runs tab provides details about the pipeline run, the status of the task, and a link to debug failed pipeline runs. Use the Options menu to stop a running pipeline, to rerun a pipeline using the same parameters and resources as that of the previous pipeline execution, or to delete a pipeline run.
Click the required pipeline run to see the Pipeline Run details page. By default, the Details tab displays a visual representation of all the serial tasks, parallel tasks, finally tasks, and when expressions in the pipeline run. The results for successful runs are displayed under the Pipeline Run results pane at the bottom of the page.
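The values shown in the Pipeline Run results pane come from the results that the pipeline declares. As a sketch, with assumed task and result names, a pipeline can surface a task result as a pipeline result like this:
spec:
  results:
    - name: IMAGE_URL                             # assumed result name
      value: $(tasks.build.results.IMAGE_URL)     # assumes a task named build that emits this result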
The Details section of the Pipeline Run details page displays a Log Snippet of the failed pipeline run. Log Snippet provides a general error message and a snippet of the log. A link to the Logs section provides quick access to the details about the failed run.
On the Pipeline Run details page, click the Task Runs tab to see the completed, running, and failed runs for the task.
The Task Runs tab provides information about the task run along with the links to its task and pod, and also the status and duration of the task run. Use the Options menu to delete a task run.
Click the required task run to see the Task Run details page. The results for successful runs are displayed under the Task Run results pane at the bottom of the page.
The Details section of the Task Run details page displays a Log Snippet of the failed task run. Log Snippet provides a general error message and a snippet of the log. A link to the Logs section provides quick access to the details about the failed task run.
Click the Parameters tab to see the parameters defined in the pipeline. You can also add or edit additional parameters, as required.
Click the Resources tab to see the resources defined in the pipeline. You can also add or edit additional resources, as required.
As a cluster administrator, to create and deploy an application from a Git repository, you can use custom pipeline templates that override the default pipeline templates provided by Red Hat OpenShift Pipelines 1.5 and later.
This feature is unavailable in Red Hat OpenShift Pipelines 1.4 and earlier versions.
Ensure that Red Hat OpenShift Pipelines 1.5 or later is installed and available in all namespaces.
Log in to the OpenShift Container Platform web console as a cluster administrator.
In the Administrator perspective, use the left navigation panel to go to the Pipelines section.
From the Project drop-down, select the openshift project. This ensures that the subsequent steps are performed in the openshift namespace.
From the list of available pipelines, select a pipeline that is appropriate for building and deploying your application. For example, if your application requires a Node.js runtime environment, select the s2i-nodejs pipeline.
Do not edit the default pipeline template. It may become incompatible with the UI and the back-end.
Under the YAML tab of the selected pipeline, click Download and save the YAML file to your local machine. If your custom configuration file fails, you can use this copy to restore a working configuration.
Disable (delete) the default pipeline templates:
Use the left navigation panel to go to Operators → Installed Operators.
Click Red Hat OpenShift Pipelines → Tekton Configuration tab → config → YAML tab.
To disable (delete) the default pipeline templates in the openshift namespace, set the pipelineTemplates parameter to false in the TektonConfig custom resource YAML, and save it.
apiVersion: operator.tekton.dev/v1alpha1
kind: TektonConfig
metadata:
  name: config
spec:
  profile: all
  targetNamespace: openshift-pipelines
  addon:
    params:
      - name: clusterTasks
        value: "true"
      - name: pipelineTemplates
        value: "false"
...
If you manually delete the default pipeline templates, the Operator restores the defaults during an upgrade.
As a cluster admin, you can disable the installation of the default pipeline templates in the Operator configuration. However, such a configuration deletes all default pipeline templates, not just the one you want to customize.
Create a custom pipeline template:
Use the left navigation panel to go to the Pipelines section.
From the Create drop-down, select Pipeline.
Create the required pipeline in the openshift namespace. Give it a different name than the default one (for example, custom-nodejs). You can use the downloaded default pipeline template as a starting point and customize it.
Under the Details tab of the created pipeline, ensure that the Labels in the custom template match the labels in the default pipeline. The custom pipeline template must have the correct labels for the runtime, type, and strategy of the application. For example, the required labels for a Node.js application deployed on OpenShift Container Platform are as follows:
...
pipeline.openshift.io/runtime: nodejs
pipeline.openshift.io/type: openshift
...
You can use only one pipeline template for each combination of runtime environment and deployment type.
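In the pipeline YAML, these labels belong under metadata.labels of the custom template, for example (the pipeline name custom-nodejs follows the earlier step):
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: custom-nodejs
  labels:
    pipeline.openshift.io/runtime: nodejs
    pipeline.openshift.io/type: openshift
spec:
...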
In the Developer perspective, use the +Add → Git Repository → From Git option to select the kind of application you want to create and deploy. Based on the required runtime and type of the application, your custom template is automatically selected.
After you create a pipeline, you need to start it to execute the included tasks in the defined sequence. You can start a pipeline from the Pipelines view, the Pipeline Details page, or the Topology view.
To start a pipeline using the Pipelines view:
In the Pipelines view of the Developer perspective, click the Options menu adjoining a pipeline, and select Start.
The Start Pipeline dialog box displays the Git Resources and the Image Resources based on the pipeline definition.
For pipelines created using the From Git option, the Start Pipeline dialog box also displays an APP_NAME field in the Parameters section, and all the fields in the dialog box are prepopulated by the pipeline template.
If you have resources in your namespace, the Git Resources and the Image Resources fields are prepopulated with those resources. If required, use the drop-downs to select or create the required resources and customize the pipeline run instance.
Optional: Modify the Advanced Options to add the credentials that authenticate the specified private Git server or the image registry.
Under Advanced Options, click Show Credentials Options and select Add secret.
In the Create Source secret section, specify the following:
A unique secret Name for the secret.
In the Designated provider to be authenticated section, specify the provider to be authenticated in the Access to field, and the base Server URL.
Select the Authentication Type and provide the credentials:
For the Authentication Type Image Registry Credentials, specify the Registry Server Address that you want to authenticate, and provide your credentials in the Username, Password, and Email fields.
Select Add Credentials if you want to specify an additional Registry Server Address.
For the Authentication Type Basic Authentication, specify the values for the UserName and Password or Token fields.
For the Authentication Type SSH Keys, specify the value of the SSH Private Key field.
For basic authentication and SSH authentication, you can use annotations such as tekton.dev/git-0: https://github.com and tekton.dev/git-1: https://gitlab.com to scope the secret to specific Git servers.
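For example, a basic authentication secret scoped to a Git server with a Tekton annotation might look like the following sketch; the secret name, server URL, and credential values are placeholders:
apiVersion: v1
kind: Secret
metadata:
  name: git-credentials                     # placeholder name
  annotations:
    tekton.dev/git-0: https://github.com    # scopes the secret to this Git server
type: kubernetes.io/basic-auth
stringData:
  username: <username>
  password: <password-or-token>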
Select the check mark to add the secret.
You can add multiple secrets based upon the number of resources in your pipeline.
Click Start to start the pipeline.
The Pipeline Run Details page displays the pipeline being executed. After the pipeline starts, the tasks and steps within each task are executed. You can:
Hover over the tasks to see the time taken to execute each step.
Click on a task to see the logs for each step in the task.
Click the Logs tab to see the logs relating to the execution sequence of the tasks. You can also expand the pane and download the logs individually or in bulk, by using the relevant button.
Click the Events tab to see the stream of events generated by a pipeline run.
You can use the Task Runs, Logs, and Events tabs to assist in debugging a failed pipeline run or a failed task run.
For pipelines created using the From Git option, you can use the Topology view to interact with pipelines after you start them:
To see the pipelines created using the Pipeline builder in the Topology view, customize the pipeline labels to link the pipeline with the application workload.
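A minimal sketch of such a label, assuming that the Topology view groups resources by the standard app.kubernetes.io/instance label and that my-app is the name of your application workload:
metadata:
  labels:
    app.kubernetes.io/instance: my-app    # assumed label; must match the application workload name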
On the left navigation panel, click Topology, and click on the application to see the pipeline runs listed in the side panel.
In the Pipeline Runs section, click Start Last Run to start a new pipeline run with the same parameters and resources as the previous one. This option is disabled if a pipeline run has not been initiated.
In the Topology page, hover to the left of the application to see the status of the pipeline run for the application.
The side panel of the application node in the Topology page displays a Log Snippet when a pipeline run fails on a specific task run. You can view the Log Snippet in the Pipeline Runs section, under the Resources tab. Log Snippet provides a general error message and a snippet of the log. A link to the Logs section provides quick access to the details about the failed run.
You can edit the pipelines in your cluster using the Developer perspective of the web console:
In the Pipelines view of the Developer perspective, select the pipeline you want to edit to see its details. On the Pipeline details page, click Actions and select Edit Pipeline.
On the Pipeline builder page, you can perform the following tasks:
Add additional tasks, parameters, or resources to the pipeline.
Click the task you want to modify to see its details in the side panel, and then modify the required details, such as the display name, parameters, and resources.
Alternatively, to delete the task, click the task, and in the side panel, click Actions and select Remove Task.
Click Save to save the modified pipeline.
You can delete the pipelines in your cluster using the Developer perspective of the web console.
In the Pipelines view of the Developer perspective, click the Options menu adjoining a pipeline, and select Delete Pipeline.
In the Delete Pipeline confirmation prompt, click Delete to confirm the deletion.