Cloud-hosted and self-managed platforms that automate building, testing, and deployment pipelines for software projects.
GitHub and GitLab together dominate AI responses for CI/CD Platforms. Both brands consistently surface unprompted, with the model treating them as the default answers for most category queries. Brands outside the top two face a structural disadvantage: the model usually presents these two before it even considers alternatives.
Ranked by overall AI Visibility Score (geometric mean of LBA, Authority, and TOM). Click any brand for the full report.
| # | Brand | LBA | Authority | TOM | Overall |
|---|---|---|---|---|---|
| 1 | GitHub (github.com) | 97 | 100 | 100 | 99 |
| 2 | GitLab (gitlab.com) | 92 | 96 | 98 | 95 |
| 3 | CircleCI (circleci.com) | 91 | 62 | 83 | 78 |
| 4 | Jenkins (jenkins.io) | 71 | 43 | 75 | 61 |
| 5 | Buildkite (buildkite.com) | 75 | 13 | 42 | 35 |
| 6 | Argo CD (argoproj.io) | 85 | 13 | 26 | 30 |
| 7 | Harness (harness.io) | 85 | 4 | 9 | 15 |
| 8 | Tekton (tekton.dev) | 77 | 3 | 9 | 12 |
| 9 | Flux CD (fluxcd.io) | 80 | 0 | 5 | 2 |
| 10 | Google (google.com) | 76 | 0 | 10 | 2 |
| 11 | TeamCity (teamcity.com) | 86 | 0 | 5 | 2 |
| 12 | Atlassian (atlassian.com) | 76 | 0 | 2 | 1 |
| 13 | Bitrise (bitrise.io) | 77 | 1 | 0 | 1 |
| 14 | Buddy (buddy.works) | 57 | 0 | 1 | 1 |
| 15 | CloudBees (cloudbees.com) | 72 | 0 | 1 | 1 |
| 16 | Concourse (concourse-ci.org) | 63 | 0 | 1 | 1 |
| 17 | Drone (drone.io) | 43 | 0 | 4 | 1 |
| 18 | GoCD (gocd.org) | 71 | 0 | 1 | 1 |
| 19 | Octopus Deploy (octopus.com) | 81 | 0 | 1 | 1 |
| 20 | Semaphore (semaphoreci.com) | 74 | 0 | 1 | 1 |
| 21 | Spinnaker (spinnaker.io) | 77 | 0 | 1 | 1 |
| 22 | Travis CI (travisci.com) | 65 | 0 | 1 | 1 |
| 23 | Woodpecker CI (woodpecker-ci.org) | 40 | 0 | 3 | 1 |
| 24 | Amazon (amazon.com) | 78 | 0 | 0 | 0 |
| 25 | Appcircle (appcircle.io) | 45 | 0 | 0 | 0 |
| 26 | AppVeyor (appveyor.com) | 68 | 0 | 0 | 0 |
| 27 | Buildbot (buildbot.net) | 67 | 0 | 0 | 0 |
| 28 | Cirrus CI (cirrus-ci.org) | 59 | 0 | 0 | 0 |
| 29 | Codefresh (codefresh.io) | 77 | 0 | 0 | 0 |
| 30 | Codemagic (codemagic.io) | 71 | 0 | 0 | 0 |
| 31 | DeployBot (deploybot.com) | 51 | 0 | 0 | 0 |
| 32 | Microsoft (microsoft.com) | 97 | 0 | 0 | 0 |
| 33 | Screwdriver (screwdriver.cd) | 55 | 0 | 0 | 0 |
| 34 | Zuul (zuul-ci.org) | 54 | 0 | 0 | 0 |
Every brand in this leaderboard is scored against the same set of 269 shared CI/CD Platforms prompts: same prompts, same model, same number of iterations. Differences in scores therefore reflect actual differences in AI visibility, not differences in measurement.
Overall = (LBA × Authority × TOM)^(1/3). The geometric mean is used so that any single weak metric pulls the overall score down rather than being masked by strength elsewhere.
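As a quick check, the Overall formula can be reproduced from the leaderboard's own numbers. This is a minimal sketch; the function name `overall_score` and the rounding to whole numbers are our assumptions, not part of the published methodology.

```python
def overall_score(lba: float, authority: float, tom: float) -> int:
    """Geometric mean of the three metrics, rounded to an integer.

    Assumes each input is on the 0-100 scale shown in the leaderboard.
    """
    return round((lba * authority * tom) ** (1 / 3))

# Reproducing the top of the table:
print(overall_score(97, 100, 100))  # GitHub → 99
print(overall_score(92, 96, 98))    # GitLab → 95
print(overall_score(91, 62, 83))    # CircleCI → 78
print(overall_score(71, 43, 75))    # Jenkins → 61
```

Note that a literal 0 in any metric would force the geometric mean to 0, yet some rows pair a displayed Authority of 0 with a nonzero Overall; the displayed 0s are presumably rounded-down values of small positive scores.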
LBA = quality × meta × stability × share × recognition × 100. Read the full LBA methodology →
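The LBA composition above can be sketched as a product of factors. This is a hypothetical illustration only: it assumes each of the five factors is a fraction in [0, 1] so that the ×100 scaling maps a perfect product to 100; the factor definitions themselves are not given here.

```python
def lba_score(quality: float, meta: float, stability: float,
              share: float, recognition: float) -> float:
    """Product of the five LBA factors, scaled to 0-100.

    Assumption: each factor is a fraction in [0, 1] (not stated in the
    source); under that reading, any weak factor drags the whole score down.
    """
    factors = (quality, meta, stability, share, recognition)
    assert all(0.0 <= f <= 1.0 for f in factors), "factors assumed in [0, 1]"
    product = 1.0
    for f in factors:
        product *= f
    return product * 100

# A perfect score in every factor yields the maximum LBA of 100:
print(lba_score(1.0, 1.0, 1.0, 1.0, 1.0))  # 100.0
```

Because the factors multiply rather than add, a single factor of 0.5 halves the score no matter how strong the others are, mirroring the geometric-mean logic of the Overall score.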