2025: THE Year for GerritForge and Git in the AI-Driven SDLC

TL;DR

In 2025, GerritForge solidified its leadership in the Git ecosystem by securing TISAX certification and patenting our Git At High Speed (GHS) technology, proving our commitment to enterprise-grade security and performance. However, the rapid industry shift toward “Agentic Software Development” has created a critical challenge: current infrastructures are struggling to convert high-volume AI code generation into measurable business value, often leading to repository slowdowns and inflated costs rather than faster releases.

Our 2026 roadmap directly addresses this “ROI Gap” through a new “Assess, Measure, Improve” framework. We are launching GHS 2.0 to scale Git specifically for AI traffic, introducing server-side autonomous agents via the Gerrit Model Context Protocol (MCP), and deploying cross-platform metrics to monitor Git repository health in real-time. This strategy ensures your SDLC infrastructure not only withstands the load of AI agents but also integrates them securely to deliver the efficiency your investments demand.

2025 in Numbers

Our commitment to the open-source community and our customers is best reflected in the sheer volume of work our team has accomplished over the past 12 months.

Contributions to the Gerrit ecosystem

  • 748 commits across 43 projects, driving the Gerrit project forward.
  • 7 authors contributing consistently to core and plugin development.
  • 21 releases delivered, ensuring stability and new features for our users.
  • 14 talks given at international conferences, sharing our expertise with the global dev community.
  • 9 GerritMeets & Conferences sponsored or organized, fostering a vibrant local and global community.

GerritForgeTV: the live stream of the Gerrit community

Our YouTube channel remains a central knowledge hub and a stage for showcasing the most recent innovations for Gerrit administrators and developers. This year was no exception: we kept it updated with new and engaging content, relevant to the latest trends in the Git and VCS world.

  • 22 new videos published, featuring key international speakers from some of the largest companies in the world, including Google, NVIDIA, Qualcomm and GitLab.
  • 126,483 impressions in the last 12 months.
  • 807 total watch hours, proving that the demand and interest for high-quality Git and Gerrit technical information and innovation is stronger than ever.

Major Successes & Milestones

GHS: From Vision to Patented Standard

In 2024, we announced Git At High Speed (GHS); one year later, we delivered on the initial promise of groundbreaking Git performance speedups and brought Git performance to the next level. We are proud to announce that GHS has now been officially submitted for US and EU patents. Furthermore, our commitment to scientific rigor led us to present the GHS Paper at ICSE 2025, where it was recognized by the global Computer Science Academic Community as a significant advancement in improving Git SCM performance.

Gerrit Community Growth and Stewardship

The GerritForge team remains central to the Gerrit project. This year, our team member Ponch was elected as a Maintainer, bringing the total number of GerritForge Gerrit Maintainers to five. On top of that, Luca was re-elected to the Engineering Steering Committee and Dani to the Community Managers. GerritForge’s deep involvement with the Gerrit community ensures that our customers’ needs are always represented at the core of the project’s development.

Security and Compliance: TISAX Certification

In 2025, we reached a significant milestone in enterprise trust by achieving TISAX certification, which is key for every software supplier to the modern Software-Defined Vehicles industry. For all industries, security and compliance are non-negotiable. Achieving TISAX certification represents our commitment to the highest levels of these standards.

Product Evolution: Gerrit BSL

For 17 years, GerritForge has operated on a 100% open source model. However, the landscape of software development is changing. Cyber threats and supply chain security compliance require a level of certification and long-term maintenance that the pure open-source model struggles to address on its own.

We introduced Gerrit Enterprise, a subscription carefully designed to shake up the Gerrit ecosystem:

  • The “Open-Core” Vision: We have separated the “Gerrit Core”—which remains 100% open source under Apache 2.0—from our high-performance enterprise plugins, which are released under BSL.
  • What is BSL? The Business Source License is a “source-available” model. It allows for public viewing and non-production use, but requires a license for commercial use.
  • Commitment to release as open source: An essential part of our BSL is the fact that after 5 years, any BSL-licensed code from GerritForge automatically converts to Apache 2.0. This ensures that while we fund today’s innovation, the community eventually benefits too.
  • Want to know more? Read the full announcement, which includes the list of plugins and projects released under BSL in 2025.

This move will provide a sustainable path to continue investing in the Gerrit core platform, its ecosystem, and the community events we all rely on to keep the project alive and thriving.

Community Events take center stage

The Gerrit User Summit 2025 has been one of the most successful events in the whole 17 years of the project’s history, thanks to the co-location with the OpenInfra Summit 2025 at the École Polytechnique of Paris and partnership with the OpenInfra Foundation. We saw fantastic participation from partners like the JJ community, GitButler, and GitLab, signaling a more integrated Git ecosystem.

We also had the most successful GerritMeets to date, dedicated to the Code Reviews in the Age of AI, hosted by Google in Munich, reaffirming Gerrit’s vital role in large-scale professional software development and its integration with the latest AI technologies to improve and accelerate the entire SDLC.

Paving the Way with AI

The future of Gerrit Code Review is happening now. In 2025, Gerrit Code Review v3.13 released a suite of new AI features and tools, including the brand-new MCP (Model Context Protocol) server open-sourced by Google. These tools are the foundation for a new way of interacting with code, paving the way for deeper integrations that make software development and code review faster and smarter.


Looking at 2026: the future is now

All the evolutions we saw across major industries in 2025 are reshaping the landscape of Git and the entire SDLC. The introduction of “Agentic Software Development” has created shockwaves across the industry and changed how we interact with and use these tools.

Using AI chats to vibe-code, orchestrating cooperating AI Coding Agents, and generating and reviewing code automatically all put tremendous strain on existing machinery that was never designed to perform and scale at this rate.

All the major companies in the SDLC are looking at developing and leveraging LLMs in their products, adding the AI vibe to their product lines; however, there is a lot of work to do to make this generational transition to AI really work for everyone.

  • Productivity vs. Output Gap
    AI tools provide significant productivity gains in code generation, code review, debugging, and testing; yet these improvements do not always translate into faster release cycles.
  • Developers’ Productivity vs. Actual Changes Merged
    Engineers using AI tools are saving time in some of the repetitive coding tasks, such as prototyping and scaffolding. However, the generated code often gets stuck in the validation queue and does not become a valuable company asset until it is properly merged.
  • The Promise of High ROI from AI
    According to a recent Deloitte study, very few AI implementations across the entire SDLC deliver significant ROI. Progress is hard to measure, and the majority of organisations with substantial AI spending still fail to achieve tangible benefits.
    Only 6% of projects see returns in under 12 months, whilst most will take at least 2 years of continuous investment.
  • Agentic AI Future vs. Reality
    Agentic AI has been made possible by increased accessibility to the existing knowledge base and SDLC infrastructure by LLMs and promises a full end-to-end automation. However, only a small fraction of companies started using it, and of those, only 10% are currently realizing significant ROI.

GerritForge bridges the gaps between AI innovations and ROI

Our mission in 2026 is to help all organisations achieve the expected ROI from their AI investments by identifying and filling the gaps that hinder the success of their SDLC implementation.

The way forward is to engage with current and new customers and introduce new technologies and innovations into their existing infrastructure.

Assess, measure, improve, repeat.

We believe that everyone can make progress and, at the same time, get more out of the money they have invested in AI. We truly believe that data is the only source of truth that can drive progress forward and on which everything should be based.

Our product plans for 2026 are all based on a smarter way to measure and improve:

  • Real-time metrics collection from Git repositories
    We will introduce a brand-new real-time data collector, based on 3 years of R&D and investment in repository performance, published at ICSE 2025 in Ottawa (Canada). The new component will be able to detect and report any repository slowdown caused by AI traffic.
    The component is planned to support GitLab, GitHub Enterprise, and Gitea, in addition to Gerrit Code Review, and is extensible to all Git-based SCM systems.

  • Native integration of Code Review experience with LLMs
    Gerrit Code Review v3.14, planned to be released in Spring 2026, will introduce an “AI Chat” with any LLM. GerritForge will take native support to a whole new level, enabling end-to-end communication and integration with Google Gemini, ChatGPT, and many other popular LLMs.
  • Agentic Gerrit Code Review
    Gerrit MCP, now Open Source, helps developers improve their client-side integration between Gerrit and LLMs using their own credentials. GerritForge will bring this paradigm to the server-side and enable real-life AI Agents to leverage Gerrit MCP for performing analysis and taking actions autonomously, without sacrificing confidentiality, security, and compliance.
  • Scale up and save money with Git, thanks to GHS 2.0
    GerritForge is bringing its GHS technology to a brand-new level, with all the experience and learnings in understanding the traffic generated by AI Agents. The new 2.0 will bring new modern actions and an improved learning model that will be able to react more accurately and bring system resources and costs into the ROI equation.

The future of Git, Gerrit Code Review, and the entire SDLC is now. AI has accelerated the race for innovation, adding speed to the competition. GerritForge is here with you, helping you embrace it and ensuring that the whole pipeline will scale and that you can really achieve the ROI that makes your company stand out from the competition.

Thank you to our team, our customers, and the incredible Gerrit community for making 2025 a year to remember. Let’s make 2026 THE ONE to remember as a turning point for the whole project, the Git ecosystem, and the community.

The GerritForge Team, January 2026

How to enable Git v2 in Gerrit Code Review


Git protocol v2 landed in Gerrit 3.1 on the 11th of October 2019. This is the last email from David Ostrovsky concluding a thread of discussion about it:

It is done now. Git wire protocol v2 is a part of open source Gerrit and will be
shipped in upcoming Gerrit 3.1 release.

And, it is even enabled per default!

Huge thank to everyone who helped to make it a reality!

A big thanks to David and the whole community for the hard work in getting this done!

This was the 3rd attempt to get the feature in Gerrit after a couple of issues encountered along the path.

Why Git protocol v2?

The Git protocol v2 introduces a big optimization in the way client and server communicate during clones and fetches.

The big change is the ability to filter, server-side, the refs not required by the client. In the previous version of the protocol, whenever a client issued a fetch, all the references were sent from the server to the client, even if the client was fetching a single ref!

In Gerrit this issue was even more evident since, as you might know, Gerrit relies heavily on refs for its internal functionality, even more so since the introduction of NoteDb.

Whenever you create a Change in Gerrit, you update or create at least 3 refs:

  • refs/changes/NN/<change-num>/<patch-set>
  • refs/changes/NN/<change-num>/meta
  • refs/sequences/changes

In the Gerrit project itself, there are currently about 104K refs/changes/* refs and 24K refs/changes/*/meta refs. Imagine updating a repo that is behind by just a couple of commits: you would still receive all those references, and they would take up most of your bandwidth.

Git protocol v2 avoids this by sending back only the references that the Git client requested.
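To get a feel for the scale of the problem, here is a small, self-contained sketch: it simulates Gerrit-style change refs in a throwaway repository and counts what a protocol v1 server would advertise on every fetch (the ref names below are illustrative, not taken from a real Gerrit instance):

```shell
# Create a throwaway repository with a single commit.
tmp=$(mktemp -d)
git -C "$tmp" init -q
git -C "$tmp" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "initial commit"
sha=$(git -C "$tmp" rev-parse HEAD)

# Each Gerrit change carries a patch-set ref and a NoteDb meta ref.
for n in 01 02 03; do
  git -C "$tmp" update-ref "refs/changes/$n/10$n/1" "$sha"
  git -C "$tmp" update-ref "refs/changes/$n/10$n/meta" "$sha"
done

# A protocol v1 server advertises ALL of these on every fetch;
# a v2 server sends only the prefixes the client asked for.
git -C "$tmp" for-each-ref --format='%(refname)' refs/changes | wc -l
```

Against a real server, `git ls-remote origin "refs/changes/*"` shows the actual volume of change refs a v1 advertisement would carry.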

Is it really faster?

Let’s see if it really does what it says on the tin. We enabled protocol v2 on GerritHub.io at the end of 2019, so let’s test it there. You will need a Git client from version 2.18 onwards.

> git clone "ssh://barbasa@review.gerrithub.io:29418/GerritCodeReview/gerrit"
> cd gerrit
> export GIT_TRACE_PACKET=1
> git -c protocol.version=2 fetch --no-tags origin master
19:16:34.583720 pkt-line.c:80           packet:        fetch< version 2
19:16:34.585050 pkt-line.c:80           packet:        fetch< ls-refs
19:16:34.585064 pkt-line.c:80           packet:        fetch< fetch=shallow
19:16:34.585076 pkt-line.c:80           packet:        fetch< server-option
19:16:34.585084 pkt-line.c:80           packet:        fetch< 0000
19:16:34.585094 pkt-line.c:80           packet:        fetch> command=ls-refs
19:16:34.585107 pkt-line.c:80           packet:        fetch> 0001
19:16:34.585116 pkt-line.c:80           packet:        fetch> peel
19:16:34.585124 pkt-line.c:80           packet:        fetch> symrefs
19:16:34.585133 pkt-line.c:80           packet:        fetch> ref-prefix master
19:16:34.585142 pkt-line.c:80           packet:        fetch> ref-prefix refs/master
19:16:34.585151 pkt-line.c:80           packet:        fetch> ref-prefix refs/tags/master
19:16:34.585160 pkt-line.c:80           packet:        fetch> ref-prefix refs/heads/master
19:16:34.585168 pkt-line.c:80           packet:        fetch> ref-prefix refs/remotes/master
19:16:34.585177 pkt-line.c:80           packet:        fetch> ref-prefix refs/remotes/master/HEAD
19:16:34.585186 pkt-line.c:80           packet:        fetch> 0000
19:16:35.052622 pkt-line.c:80           packet:        fetch< d21ee1980f6db7a0845e6f9732471909993a205c refs/heads/master
19:16:35.052687 pkt-line.c:80           packet:        fetch< 0000
From ssh://review.gerrithub.io:29418/GerritCodeReview/gerrit
 * branch                  master     -> FETCH_HEAD
19:16:35.175324 pkt-line.c:80           packet:        fetch> 0000

> git -c protocol.version=1 fetch --no-tags origin master
19:16:57.035135 pkt-line.c:80           packet:        fetch< d21ee1980f6db7a0845e6f9732471909993a205c HEAD\0 include-tag multi_ack_detailed multi_ack ofs-delta side-band side-band-64k thin-pack no-progress shallow agent=JGit/unknown symref=HEAD:refs/heads/master
19:16:57.037456 pkt-line.c:80           packet:        fetch< 07c8a169d6341c586a10163e895973f1bdccff92 refs/changes/00/100000/1
19:16:57.037489 pkt-line.c:80           packet:        fetch< 0014ca6443ac0af338e2677b45e538782bb7a12e refs/changes/00/100000/meta
19:16:57.037502 pkt-line.c:80           packet:        fetch< b4af8cad4d3982a0bba763a5e681d26078da5a0e refs/changes/00/100400/1
19:16:57.037513 pkt-line.c:80           packet:        fetch< 9ec6e507c493f4f1905cd090b47447e66b51b7e1 refs/changes/00/100400/meta
19:16:57.037523 pkt-line.c:80           packet:        fetch< a80359367529288eea3c283e7d542164bced1e2f refs/changes/00/100800/1
19:16:57.037533 pkt-line.c:80           packet:        fetch< 170cced6d81c25d1082d95e50b37883e113efd01 refs/changes/00/100800/meta
19:16:57.037544 pkt-line.c:80           packet:        fetch< 6cb616e0ad4b3274d4b728f8f7b641b6bd22dce4 refs/changes/00/100900/1
19:16:57.037554 pkt-line.c:80           packet:        fetch< 286d1ee1574127b76c4c1a6ef0f918ad4c61953a refs/changes/00/100900/meta
19:16:57.037606 pkt-line.c:80           packet:        fetch< 312ba566d2620b43fb90be3e7c406949edf6b6d9 refs/changes/00/10100/1
19:16:57.037619 pkt-line.c:80           packet:        fetch< dde4b73cb011178584aae4fb29a528018149d20b refs/changes/00/10100/meta

…. This will go on forever …. 

As you can see there is a massive difference in the data sent back on the wire!

How to enable it?

If you want to enable it, you just need to update your Git config (etc/jgit.config in Gerrit 3.1, $HOME/.gitconfig in previous versions) with the protocol version and restart your server:

[protocol]
  version = 2

Enjoy your new blazing fast protocol!
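The client can opt in as well. A quick sanity check, sandboxed in a temporary HOME so it does not touch your real ~/.gitconfig:

```shell
# Sandbox HOME so this example does not modify your real ~/.gitconfig.
export HOME=$(mktemp -d)

# Enable protocol v2 globally on the client...
git config --global protocol.version 2
git config --global --get protocol.version   # prints: 2

# ...or force it for a single command, as in the traces above:
# git -c protocol.version=2 fetch --no-tags origin master
```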

If you are interested in more details about the Git v2 protocol you can find the specs here.

Fabio Ponciroli (GerritForge)
Gerrit Code Review Contributor

Jenkins ❤︎ Gerrit Code Review, again

Gerrit Code Review has been integrated with Jenkins for over nine years. It goes back to when Kohsuke was still a Senior Engineer at Sun Microsystems, whose acquisition by Oracle had just been announced, and his open-source CI project was still called Hudson.

Jenkins and Gerrit are the most critical components of the DevOps pipeline because of their focus on people (the developers), their code and collaboration (the code review), and their builds and tests (the Jenkinsfile pipeline) that produce the value stream to the end user.

The integration between code and build is so important that other solutions like GitLab have made it a single integrated tool, and even GitHub started covering the “last mile” a few months ago by offering powerful Actions APIs and workflows to automate build actions around code collaboration.

Accelerate the CI/CD pipeline

DevOps is all about iteration and fast feedback. That can be achieved by automating the build and verification of code changes into a target environment, allowing all the stakeholders to have early access to what the feature will look like and validating the results with speed and quality at the same time.

Every development team wants to make the cycle time smaller and spend less time in tedious work by automating it as much as possible. That trend has created a new explosion of fully automated processes called “Bots” that are more and more responsible for performing those tasks that developers are not interested in doing manually over and over again.

As a result, developers are doing more creative and design work, are more motivated and productive, can address technical debt a lot sooner and allow the business to go faster in more innovative directions.

As more and more companies adopt DevOps, it becomes more important to be better and faster than your competitors. The most effective way to accelerate is to extract your data, understand where your bottlenecks are, experiment with changes and measure progress.

Humans vs. Bots

The Gerrit Code Review project is fully based on an automated DevOps pipeline using Jenkins. We collect the data produced during the development and testing of the platform and constantly extract metrics and graphs from it at https://analytics.gerrithub.io, thanks to the open-source solution Gerrit DevOps Analytics (aka GDA).

By looking at the protocol and code statistics, we found out that bots are much harder workers than humans on GerritHub.io, which hosts, apart from the mirrored Gerrit Code Review projects, many other popular open-source projects.

That should not come as a surprise if you think of how many activities could potentially happen whenever a PatchSet is submitted in Gerrit: style checking, static code analysis, unit and integration testing, etc.


We also noticed that most of the bots’ activity happens over SSH. We started to analyze what the bots are doing, what their impact on our service is, and whether there are any improvements we could make.

Build integration, the wrong way

GerritHub has an active site with multiple nodes serving read/write traffic and a disaster recovery site ready to take over whenever the active one has any problem.

Whenever we roll out a new version of Gerrit, we swap the roles of the two sites using the so-called ping-pong technique (see here for more details). Within the same site, traffic can also jump from one node to another in the same cluster using active failover, based on health, load and availability. The issue is that we end up in a situation like the following:

Basic Use Case Diagram

The “old” instance still served SSH traffic after the switch. We noticed we had loads of long-lived SSH connections. These are mostly integration tools keeping SSH connections open listening to Gerrit events.

Long-lived SSH connections have several issues:

  • SSH traffic doesn’t allow smart routing. Hence we end up with HTTP traffic going to the currently active node while most of the SSH traffic stays on the old one
  • There is no resource pooling since the connections are not released
  • There is the potential loss of events when restarting the connections

That impacts the overall stability of the software delivery lifecycle, extending the feedback loop and slowing your DevOps pipeline down.

Then we started to drill down into the stateful connections to understand why they exist, where they are coming from and, most importantly, which part of the pipeline they belong to.

Jenkins Integration use-case

The Gerrit Trigger plugin for Jenkins is one of the integration tools that has historically suffered from those problems and, unfortunately, over the years the initial tight integration has become less effective, slower and more complex to use.

There are mainly two options to integrate Jenkins with Gerrit:

  • the Gerrit Trigger Plugin, which listens to the Gerrit stream events over SSH
  • the Gerrit Code Review plugin, which receives Gerrit webhook notifications over HTTP

We use both of them with the Gerrit Code Review project, and we have put together a summary of how they compare to each other:

  • Trigger mechanism
    Gerrit Trigger Plugin: stateful, Jenkins listens to the Gerrit events stream.
    Gerrit Code Review plugin: stateless, Gerrit webhooks notify events to Jenkins.
    Note: stateful stream events consume resources on both Jenkins and Gerrit.
  • Transport protocol
    Gerrit Trigger Plugin: SSH session on a long-lived stream-events connection.
    Gerrit Code Review plugin: HTTP calls for each individual stream event.
    Note: SSH cannot be load-balanced, and SSH connections cannot be pooled or reused.
  • Setup complexity
    Gerrit Trigger Plugin: hard; requires node-level and project-level configuration, with no native Jenkinsfile pipeline integration.
    Gerrit Code Review plugin: easy; no special knowledge required, and it integrates natively with Jenkinsfile and multi-branch pipelines.
    Note: configuring the Gerrit Trigger Plugin is more error-prone because it requires a lot of parameters and settings.
  • Systems dependencies
    Gerrit Trigger Plugin: tightly coupled with Gerrit versions and plugins.
    Gerrit Code Review plugin: uses Gerrit as a generic Git server, loosely coupled.
    Note: upgrading Gerrit might break the Gerrit Trigger Plugin integration.
  • Gerrit knowledge
    Gerrit Trigger Plugin: admin; you need to know a lot of Gerrit-specific settings to integrate with Jenkins.
    Gerrit Code Review plugin: user; you only need to know the Gerrit clone URL and credentials.
    Note: the Gerrit Trigger Plugin requires a special user and permissions to listen to Gerrit stream events.
  • Fault tolerance to Jenkins restart
    Gerrit Trigger Plugin: missed events, unless you install a server-side DB to capture and replay the events.
    Gerrit Code Review plugin: transparent; all events are sent as soon as Jenkins is back.
    Note: Gerrit webhooks automatically track and retry events transparently.
  • Tolerance to Gerrit rolling restart
    Gerrit Trigger Plugin: events are stuck until the connection is reset.
    Gerrit Code Review plugin: transparent; any active Gerrit node continues to send events.
    Note: the Gerrit Trigger Plugin is forced to terminate the stream with a watchdog, but will still miss events.
  • Differentiate actions per stage
    Gerrit Trigger Plugin: no flexibility to tailor the Gerrit labels to each stage of the Jenkinsfile pipeline.
    Gerrit Code Review plugin: full access to Gerrit labels and comments in the Jenkinsfile pipeline.
  • Multi-branch support
    Gerrit Trigger Plugin: custom; you need to use its environment variables to check out the right branch.
    Gerrit Code Review plugin: native; integrates with multi-branch projects and Jenkinsfile pipelines without any special setup.
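On the Gerrit side, the stateless integration is typically wired through the webhooks plugin, which reads a webhooks.config file from the project’s refs/meta/config branch. A minimal sketch; the remote name, the Jenkins URL and the selected events below are illustrative values, not defaults:

```ini
[remote "jenkins"]
  url = https://jenkins.example.com/gerrit-webhook/
  event = patchset-created
  event = change-merged
```

With a configuration of this shape, Gerrit POSTs the selected events to Jenkins over HTTP, so no long-lived SSH connection is required.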

Gerrit and Jenkins friends again

After so many years of adoption, evolution and, at times, struggle in using them together, Gerrit Code Review finally has first-class integration with Jenkins, liberating development teams from the tedious configuration and business-as-usual management of triggering a build from a change under review.

Jenkins users truly love using Gerrit, and vice versa: friends, and productive together, again.

Conclusion

Thanks to Gerrit DevOps Analytics (GDA), we managed to find one of the bottlenecks of the Gerrit DevOps pipeline and make changes that render it faster, more effective and more reliable than ever before.

In this case, just by picking the right Jenkins integration plugin, your Gerrit Code Review server will run faster, with less resource utilization. Your Jenkins pipeline will be simpler and more reliable in validating each change under review, without delays or hiccups.

The Gerrit Code Review plugin for Jenkins is definitely the first-class integration with Gerrit. Give it a try yourself; you won’t believe how easy it is to set up.

Fabio Ponciroli
Gerrit Code Review Contributor, GerritForge.

Accelerate with Gerrit DevOps Analytics, in one click!

 

Accelerating your time to market while delivering high-quality products is vital for any company of any size. This fast-paced and ever-evolving world relies on getting quicker and better at the production pipeline of products. The DevOps and Lean methodologies help achieve the needed speed and quality by continuously improving the process in a so-called feedback loop. The faster the cycle, the quicker you gain the competitive advantage to outperform and beat the competition.

It is fundamental to have a scientific approach and put metrics in place to measure and monitor the progress of the different actors in the whole software lifecycle and delivery pipeline.

Gerrit DevOps Analytics (GDA) to the rescue

We need data to build the metrics around which we design our continuous improvement lifecycle. We need to extract information from all the components we use, directly or indirectly, on a daily basis:

  • SCM/VCS (Source and Configuration Management, Version Control System)
    How many commits are going through the pipeline?
  • Code Review
    What’s the lead time for a piece of code to get validated?
    How are people interacting and cooperating around the code?
  • Issue tracker (e.g. Jira)
    How long does the end-to-end lifecycle take outside of development, from idea to production?

Getting logs from these sources and understanding what they are telling us is fundamental to anticipating delays in deliveries, evaluating the risk of a product release and making changes in the organization to accelerate the teams’ productivity. That is not an easy task.
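For the first of those questions, even plain Git can produce the raw numbers; GDA’s value is in automating, aggregating and correlating this across many repositories and systems. A tiny illustration on a throwaway repository (the author names are made up):

```shell
# Throwaway repo with a few commits from two authors.
tmp=$(mktemp -d)
git -C "$tmp" init -q
git -C "$tmp" -c user.name=alice -c user.email=alice@example.com \
    commit -q --allow-empty -m "feature A"
git -C "$tmp" -c user.name=alice -c user.email=alice@example.com \
    commit -q --allow-empty -m "feature B"
git -C "$tmp" -c user.name=bob -c user.email=bob@example.com \
    commit -q --allow-empty -m "fix C"

# Commits per author: the simplest SCM metric in the GDA family.
git -C "$tmp" shortlog -sn HEAD
```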

Gerrit DevOps Analytics (aka GDA) is an open-source solution for collecting data, aggregating it across different dimensions and exposing meaningful metrics in a timely fashion.

GDA is part of the Gerrit Code Review ecosystem and was presented during the last Gerrit User Summit 2018 at Cloudera HQ in Palo Alto. However, GDA is not limited to Gerrit and aims at integrating and processing any information coming from other version control and code review systems, including GitLab, GitHub and BitBucket.

Case study: GDA applied to the Gerrit Code Review project

One of the golden rules of Lean and DevOps is continuous improvement: “eating your own dog food” is the perfect way to measure the progress of the solution, by using its outcome in our daily work of developing GDA itself.

As part of the Gerrit project, I have been working with GerritForge to create open-source tools to develop the GDA dashboards. These are based on events coming from Gerrit and Git, but we also extract data coming from the CI system and the issue tracker. These tools include the ETL jobs for data extraction and the components for presenting the data.

As you will see in the examples, Gerrit is not just the code review tool itself but also its plugin ecosystem; hence you might want to include the plugins as well in any collection and processing of analytics data.

Wanna try GDA? You are just one click away.

We made GDA more accessible to everybody, so that more people can play with it and understand its potential. We created the Gerrit Analytics Wizard plugin so you can get insights into your data with just one click.

What you can do

With the Gerrit Analytics Wizard, with only one click you get:

  • an initial setup of an Analytics playground with some default charts
  • a dashboard populated with data coming from one or more projects of your choice

The full GDA experience

When using the full GDA experience, you have full control of your data; for example, you can:

  • schedule recurring data imports (the wizard only runs a one-off import of the data)
  • create a production-ready environment (the wizard builds a playground to explore the potential of GDA)

What components are needed?

To run the Gerrit Analytics Wizard you need Gerrit with the GDA Analytics and Analytics Wizard plugins installed, plus Docker to run the dashboard components.

You can find here more detailed information about the installation.

One click to crunch loads of data

Once you have Gerrit and the GDA Analytics and Wizard plugins installed, choose the top menu item Analytics Wizard > Configure Dashboard.

You land on the Analytics Wizard and can configure the following parameters:

  • Dashboard name (mandatory): name of the dashboard to create
  • Projects prefix (optional): prefix of the projects to import, e.g. “gerrit” will match all the projects starting with the prefix “gerrit”. NOTE: the prefix does not support wildcards or regular expressions.
  • Date time-frame (optional): date and time interval of the data to import. If not specified the whole history will be imported without restrictions of date or time.
  • Username/Password (optional): credentials for Gerrit API, if basic auth is needed to access the project’s data.

Sample dashboard analytics wizard page:

Once you are done with the configuration, press the “Create Dashboard” button and wait for the dashboard, tailored to your data, to be created (beware: this operation will take a while, since it needs to download several Docker images and run an ETL job to collect and aggregate the data).

At the end of the data crunching you will be presented with a Dashboard with some initial Analytics graphs like the one below:

dashboard-e1549490575330.png

You can now navigate among the different charts and dimensions, through time, projects, people and teams, uncovering the potential of your data thanks to GDA!

What has just happened behind the scenes?

When you press the “Create Dashboard” button, loads of magic happens behind the scenes. Several Docker images are downloaded in order to run local ElasticSearch and Kibana instances, set up the dashboard and run the ETL job to import the data. Here is a sequence diagram illustrating the chain of events:

components.png

Conclusion

Getting insights into your data is so important, and it has never been so simple. GDA is an open-source and SaaS (Software as a Service) solution designed, implemented and operated by GerritForge. GDA lets you set up the extraction flows and gives you an “out-of-the-box” solution for accelerating your company’s business right now.

Contact us if you need any help with setting up a Data Analytics pipeline or if you have any feedback about Gerrit DevOps Analytics.

Fabio Ponciroli – Gerrit Code Review Contributor – GerritForge Ltd.