Integrated services that combine source hosting with build pipelines, automated testing, deployment workflows, and monitoring to streamline software delivery.
GitHub and GitLab together dominate AI responses for Managed DevOps and CI/CD Platforms. Both brands consistently surface unprompted, with the model treating them as the default answers for most category queries. Brands outside the top two face a structural disadvantage: users are usually given these two before the model even considers alternatives.
Ranked by overall AI Visibility Score (geometric mean of LBA, Authority, and TOM). Click any brand for the full report.
| # | Brand | LBA | Authority | TOM | Overall |
|---|---|---|---|---|---|
| 1 | GitHub (github.com) | 100 | 97 | 100 | 99 |
| 2 | GitLab (gitlab.com) | 95 | 99 | 97 | 97 |
| 3 | Azure DevOps (azure.com) | 88 | 34 | 77 | 61 |
| 4 | CircleCI (circleci.com) | 91 | 26 | 78 | 57 |
| 5 | Jenkins (jenkins.io) | 74 | 16 | 33 | 34 |
| 6 | Bitbucket (bitbucket.org) | 77 | 3 | 56 | 23 |
| 7 | Harness (harness.io) | 88 | 5 | 19 | 21 |
| 8 | Terraform (hashicorp.com) | 80 | 7 | 8 | 16 |
| 9 | Argo CD (argoproj.io) | 86 | 5 | 7 | 15 |
| 10 | Buildkite (buildkite.com) | 76 | 4 | 13 | 15 |
| 11 | Microsoft (microsoft.com) | 100 | 3 | 1 | 6 |
| 12 | Octopus Deploy (octopus.com) | 80 | 0 | 5 | 5 |
| 13 | Google (google.com) | 88 | 0 | 20 | 3 |
| 14 | Atlassian (atlassian.com) | 79 | 0 | 8 | 2 |
| 15 | Render (render.com) | 76 | 0 | 15 | 2 |
| 16 | Vercel (vercel.com) | 85 | 0 | 11 | 2 |
| 17 | Buddy (buddy.works) | 51 | 0 | 1 | 1 |
| 18 | CloudBees (cloudbees.com) | 72 | 0 | 1 | 1 |
| 19 | Semaphore (semaphoreci.com) | 75 | 0 | 1 | 1 |
| 20 | TeamCity (teamcity.com) | 87 | 0 | 1 | 1 |
| 21 | Travis CI (travisci.com) | 69 | 0 | 1 | 1 |
| 22 | Amazon (amazon.com) | 83 | 0 | 0 | 0 |
| 23 | Appcircle (appcircle.io) | 43 | 0 | 0 | 0 |
| 24 | Assembla (assembla.com) | 42 | 0 | 0 | 0 |
| 25 | Beanstalk (beanstalkapp.com) | 57 | 0 | 0 | 0 |
| 26 | Codeberg (codeberg.org) | 42 | 0 | 0 | 0 |
| 27 | Codefresh (codefresh.io) | 76 | 0 | 0 | 0 |
| 28 | Codemagic (codemagic.io) | 73 | 0 | 0 | 0 |
| 29 | Concourse (concourse-ci.org) | 55 | 0 | 0 | 0 |
| 30 | Drone (drone.io) | 54 | 0 | 0 | 0 |
| 31 | Gitea (gitea.com) | 64 | 0 | 0 | 0 |
| 32 | GoCD (gocd.org) | 74 | 0 | 0 | 0 |
| 33 | Nevercode (nevercode.io) | 50 | 0 | 0 | 0 |
| 34 | RhodeCode (rhodecode.com) | 42 | 0 | 0 | 0 |
| 35 | SourceHut (sr.ht) | 66 | 0 | 0 | 0 |
| 36 | Upsource (jetbrains.com) | 12 | 0 | 0 | 0 |
| 37 | Woodpecker (woodpecker.co) | 37 | 0 | 0 | 0 |
Every brand in this leaderboard is scored against the same set of 287 shared Managed DevOps and CI/CD Platforms prompts: the same prompts, the same model, the same number of iterations. Differences in scores therefore reflect actual differences in AI visibility, not differences in measurement.
Overall = (LBA × Authority × TOM)^(1/3). The geometric mean is used so that any single weak metric pulls the overall score down, rather than being masked by strength elsewhere.
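The Overall column can be reproduced directly from the three component columns. A minimal Python sketch (it takes the displayed integers as exact inputs, so rows whose components were rounded for display may recompute a point off):

```python
def overall_score(lba: float, authority: float, tom: float) -> float:
    """AI Visibility Overall: geometric mean of LBA, Authority, and TOM."""
    return (lba * authority * tom) ** (1 / 3)

# Top of the leaderboard, recomputed from the table's component columns.
print(round(overall_score(100, 97, 100)))  # GitHub       -> 99
print(round(overall_score(95, 99, 97)))    # GitLab       -> 97
print(round(overall_score(88, 34, 77)))    # Azure DevOps -> 61

# The multiplicative form means a single zero component zeroes the
# whole score, no matter how strong the other two are:
print(round(overall_score(83, 0, 0)))      # Amazon       -> 0
```

The last line shows why so many brands with respectable LBA scores sit at an Overall of 0: one zero metric collapses the geometric mean.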
LBA = quality × meta × stability × share × recognition × 100. Read the full LBA methodology →
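The LBA formula above is a straight product of five factors. A sketch under assumptions the source does not state: each factor is treated here as a fraction in [0, 1], with the ×100 mapping the product onto the leaderboard's 0–100 scale, and the input values are purely hypothetical:

```python
from math import prod

def lba(quality: float, meta: float, stability: float,
        share: float, recognition: float) -> float:
    """LBA = quality * meta * stability * share * recognition * 100.

    Assumption (not defined in the source): each factor is a
    fraction in [0, 1], so the product times 100 lands on 0-100.
    """
    factors = (quality, meta, stability, share, recognition)
    assert all(0.0 <= f <= 1.0 for f in factors), "factors assumed fractional"
    return prod(factors) * 100

print(round(lba(0.95, 0.9, 0.98, 0.85, 0.9)))  # hypothetical inputs -> 64
```

As with the Overall score, the purely multiplicative form means any one weak factor drags LBA down sharply.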