2025: THE Year for GerritForge and Git in the AI-Driven SDLC

TL;DR

In 2025, GerritForge solidified its leadership in the Git ecosystem by securing TISAX certification and patenting our Git At High Speed (GHS) technology, proving our commitment to enterprise-grade security and performance. However, the rapid industry shift toward “Agentic Software Development” has created a critical challenge: current infrastructures are struggling to convert high-volume AI code generation into measurable business value, often leading to repository slowdowns and inflated costs rather than faster releases.

Our 2026 roadmap directly addresses this “ROI Gap” through a new “Assess, Measure, Improve” framework. We are launching GHS 2.0 to scale Git specifically for AI traffic, introducing server-side autonomous agents via the Gerrit Model Context Protocol (MCP), and deploying cross-platform metrics to monitor Git repository health in real-time. This strategy ensures your SDLC infrastructure not only withstands the load of AI agents but also integrates them securely to deliver the efficiency your investments demand.

2025 in Numbers

Our commitment to the open-source community and our customers is best reflected in the sheer volume of work our team has accomplished over the past 12 months.

Change contributions to the Gerrit ecosystem

  • 748 commits across 43 projects, driving the Gerrit project forward.
  • 7 authors contributing consistently to core and plugin development.
  • 21 releases delivered, ensuring stability and new features for our users.
  • 14 talks given at international conferences, sharing our expertise with the global dev community.
  • 9 GerritMeets & Conferences sponsored or organized, fostering a vibrant local and global community.

GerritForgeTV: the live stream of the Gerrit community

Our YouTube channel remains a central knowledge hub and a stage for showcasing the most recent innovations for Gerrit administrators and developers. This year was no exception: we kept adding new and engaging content, keeping the channel relevant to the latest trends in the Git and VCS world.

  • 22 new videos published, featuring key international speakers from some of the largest companies in the world, including Google, NVIDIA, Qualcomm, and GitLab.
  • 126,483 impressions in the last 12 months.
  • 807 total watch hours, showing that the demand for and interest in high-quality Git and Gerrit technical information and innovation are stronger than ever.

Major Successes & Milestones

GHS: From Vision to Patented Standard

In 2024, we announced Git At High Speed (GHS); one year later, we delivered on the initial promise of groundbreaking Git performance speedups and brought Git performance to the next level. We are proud to announce that GHS has now been officially submitted for US and EU patents. Furthermore, our commitment to scientific rigor led us to present the GHS Paper at ICSE 2025, where it was recognized by the global Computer Science Academic Community as a significant advancement in improving Git SCM performance.

Gerrit Community Growth and Stewardship

The GerritForge team remains central to the Gerrit project. This year, our team member Ponch was elected as a maintainer, bringing GerritForge’s total to five Gerrit maintainers. On top of that, Luca was re-elected to the Engineering Steering Committee and Dani to the Community Managers. GerritForge’s deep involvement with the Gerrit community ensures that our customers’ needs are always represented at the core of the project’s development.

Security and Compliance: TISAX Certification

In 2025, we reached a significant milestone in enterprise trust by achieving TISAX certification, which is key for every software supplier to the modern Software-Defined Vehicle industry. Security and compliance are non-negotiable in every industry, and this certification represents our commitment to the highest levels of both.

Product Evolution: Gerrit BSL

For 17 years, GerritForge has operated on a 100% open source model. However, the landscape of software development is changing. Cyber threats and supply chain security compliance require a level of certification and long-term maintenance that the pure open-source model struggles to address on its own.

We introduced Gerrit Enterprise, a subscription carefully designed to sustain and strengthen the Gerrit ecosystem:

  • The “Open-Core” Vision: We have separated the “Gerrit Core”—which remains 100% open source under Apache 2.0—from our high-performance enterprise plugins, which are released under BSL.
  • What is BSL? The Business Source License is a “source-available” model. It allows for public viewing and non-production use, but requires a license for commercial use.
  • Commitment to release as open source: An essential part of our BSL is the fact that after 5 years, any BSL-licensed code from GerritForge automatically converts to Apache 2.0. This ensures that while we fund today’s innovation, the community eventually benefits too.
  • Want to know more? Read the full announcement, which includes the list of plugins and projects released under BSL in 2025.

This move will provide a sustainable path to continue investing in the Gerrit core platform, its ecosystem, and the community events we all rely on to keep the project alive and thriving.

Community Events take center stage

The Gerrit User Summit 2025 was one of the most successful events in the project’s 17-year history, thanks to its co-location with the OpenInfra Summit 2025 at the École Polytechnique in Paris and our partnership with the OpenInfra Foundation. We saw fantastic participation from partners like the JJ community, GitButler, and GitLab, signaling a more integrated Git ecosystem.

We also had the most successful GerritMeets to date, dedicated to Code Reviews in the Age of AI and hosted by Google in Munich, reaffirming Gerrit’s vital role in large-scale professional software development and its integration with the latest AI technologies to improve and accelerate the entire SDLC.

Paving the Way with AI

The future of Gerrit Code Review is happening now. In 2025, Gerrit Code Review v3.13 released a suite of new AI features and tools, including the brand-new MCP (Model Context Protocol) server open-sourced by Google. These tools are the foundation for a new way of interacting with code, paving the way for deeper integrations that make software development and code review faster and smarter.


Looking at 2026: the future is now

All the evolutions we saw across major industries in 2025 are reshaping the landscape of Git and the entire SDLC. The introduction of “Agentic Software Development” has created shockwaves across the industry and changed how we interact with and use these tools.

Using AI chats to vibe-code, orchestrating cooperating AI coding agents, and generating and reviewing code automatically all put tremendous strain on existing machinery that was never designed to perform and scale at this rate.

All the major companies in the SDLC are looking at developing and leveraging LLMs in their products, adding the AI vibe to their product lines; however, there is a lot of work to do to make this generational transition to AI really work for everyone.

  • Productivity vs. Output Gap
    AI tools provide significant productivity gains in code generation, code review, debugging, and testing; however, these improvements do not always translate into faster release cycles.
  • Developers’ Productivity vs. Actual Changes Merged
    Engineers using AI tools are saving time in some of the repetitive coding tasks, such as prototyping and scaffolding. However, the generated code often gets stuck in the validation queue and does not become a valuable company asset until it is properly merged.
  • The Promise of High ROI from AI
    According to a recent Deloitte study, very few AI implementations across the entire SDLC deliver significant ROI. Progress is hard to measure, and the majority of organisations with substantial AI spending still fail to achieve tangible benefits.
    Only 6% of projects see returns in under 12 months, whilst most will take at least 2 years of continuous investment.
  • Agentic AI Future vs. Reality
    Agentic AI has been made possible by giving LLMs broader access to the existing knowledge base and SDLC infrastructure, and it promises full end-to-end automation. However, only a small fraction of companies have started using it, and of those, only 10% are currently realizing significant ROI.

GerritForge bridges the gap between AI innovation and ROI

Our mission in 2026 is to help all organisations achieve the expected ROI from their AI investments by identifying and filling the gaps that hinder the success of their SDLC implementation.

The way forward is to engage with current and new customers and introduce new technologies and innovations into their existing infrastructure.

Assess, measure, improve, repeat.

We believe that everyone can make progress and, at the same time, do more with the money they have already invested in AI. We are true believers that data is the only truth that can drive progress forward and the foundation on which everything should be based.

Our product plans for 2026 are all based on a smarter way to measure and improve:

  • Real-time metrics collection from Git repositories
    We will introduce a brand-new real-time data collector, built on the three years of R&D and investment in repository performance published at ICSE 2025 in Ottawa (CA). The new component will be able to detect and advertise any repository slowdown caused by AI traffic; see the quick sketch after this list.
    The component is planned to support GitLab, GitHub Enterprise, and Gitea in addition to Gerrit Code Review, and is extensible to all Git-based SCM systems.

  • Native integration of Code Review experience with LLMs
    Gerrit Code Review v3.14, planned to be released in Spring 2026, will introduce an “AI Chat” with any LLM. GerritForge will take native support to a whole new level, enabling end-to-end communication and integration with Google Gemini, ChatGPT, and many other popular LLMs.
  • Agentic Gerrit Code Review
    Gerrit MCP, now open source, helps developers improve their client-side integration between Gerrit and LLMs using their own credentials. GerritForge will bring this paradigm to the server side and enable real-life AI agents to leverage Gerrit MCP to perform analysis and take actions autonomously, without sacrificing confidentiality, security, or compliance.
  • Scale up and save money with Git, thanks to GHS 2.0
    GerritForge is bringing its GHS technology to a whole new level, applying all the experience and learnings gained in understanding the traffic generated by AI agents. GHS 2.0 will bring new modern actions and an improved learning model able to react more accurately and bring system resources and costs into the ROI equation.
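
As a back-of-the-envelope illustration of the kind of signals the new collector will watch (the collector itself is a separate component; the commands below are stock Git, and the remote name is just an example):

$ time git ls-remote origin > /dev/null        # ref advertisement latency, one of the first things heavy AI traffic inflates
$ GIT_TRACE_PERFORMANCE=1 git fetch origin     # per-phase timing printed by Git itself on stderr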

The future of Git, Gerrit Code Review, and the entire SDLC is now. AI has accelerated the race for innovation, adding speed to the competition. GerritForge is there with you, helping you embrace it, ensuring the whole pipeline scales, and making sure you really achieve the ROI that makes your company stand out from the competition.

Thank you to our team, our customers, and the incredible Gerrit community for making 2025 a year to remember. Let’s make 2026 THE ONE to remember as a turning point for the whole project, the Git ecosystem, and the community.

The GerritForge Team, January 2026

How Git and Gerrit are Re-Tooling for the Age of AI

A Special Report from the Gerrit User Summit 2025

First, a huge thank you to the OpenInfra Foundation for hosting this event in Paris. Their invitation to have the Gerrit User Summit join the rest of the community set the stage for a truly collaborative and impactful gathering.

Paris last weekend wasn’t just a conference; it was a reunion. Fourteen years after the last GitTogether at Google’s Mountain View HQ, the “Git and Gerrit, together again” spirit was electric.

On October 18-19, luminaries from the early days (Scott, Martin, Luca, and many others) reconvened, sharing the floor with the new generation of innovators. The atmosphere was intense, filled with the same collaborative energy of 2011, but focused on a new set of challenges. The core question: how to evolve Git and Gerrit for the next decade of software development, a future dominated by AI, massive scale, and an urgent demand for smarter workflows.

Here are the key dispatches from the summit floor.

A Historic Reunion, A Shared Future

This event was a powerful reminder that the open-source spirit of cross-pollination is alive and well. The discussions were invigorated by the “fresh air” from new-school tools like GitButler and Jujutsu (JJ), which are fundamentally rethinking the developer experience.

In a significant show of industry-wide collaboration, we were delighted to have GitLab actively participating. Patrick’s technical presentation on the status of reftable was a highlight, but it was his engagement in discussions about collaborative solutions going forward with the Gerrit community that truly set the tone. It’s clear that the challenges ahead are shared by all platforms, and the solutions will be too.
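
For readers who want to experiment with reftable today, recent Git releases can create a repository on the new ref backend directly (a minimal sketch; this assumes a Git build with reftable support, roughly Git 2.45 or newer):

$ git init --ref-format=reftable myrepo
$ git -C myrepo config extensions.refstorage
reftable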

Scaling Git in the Age of AI

The central theme was scale. In this rapidly accelerating AI era, software repositories are growing at an unprecedented rate across all platforms—Gerrit, GitHub, and GitLab alike. This isn’t a linear increase; it’s an explosion, and it’s pushing SCM systems to their breaking point.

The consensus was clear: traditional vertical and horizontal scaling is no longer enough. The community is now in a race to explore new techniques—from the metal up—to improve performance, slash memory usage, and make core Git operations efficient at a scale we’ve never seen before. This summit was a rare chance for maintainers from different ecosystems to align on these shared problems and forge collaborative paths to solutions.

Dispatches from the Front Lines: NVIDIA and Qualcomm

This challenge isn’t theoretical. We heard powerful testimonials from industry giants NVIDIA and Qualcomm, who are on the front lines of the AI revolution.

They shared fascinating and sobering insights into the repository explosion they are actively managing. Their AI workflows—encompassing massive datasets, huge model binaries, and unprecedented CI/CD activity—are generating data on a scale that is stressing even the most robust SCM systems. Their presentations detailed the unique challenges and innovative approaches they are pioneering to tackle this data gravity, providing invaluable real-world context that fueled the summit’s technical deep dives.

Beyond the Pull Request: The Quest for a ‘Commit-First’ World

One of the most passionate debates centered on the developer workflow itself. The wider Git community increasingly recognizes that the traditional, monolithic “pull request” model is ill-suited to the “change-focused” code review that platforms like Gerrit have championed for years.

The benefits of a change-based workflow (cleaner history, better hygiene, and higher-quality atomic changes) are driving a growing interest in standardizing a persistent Change-ID for each commit. This would make structured, atomic reviews a first-class citizen in Git itself. The collaboration at the summit between the Gerrit community, GitButler, JJ, and other Git contributors on defining this standard was a major breakthrough.
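
Gerrit users already see a version of this today: the commit-msg hook appends a Change-Id trailer to every commit message, and that identifier survives rebases and amendments so the review stays attached to the logical change. A minimal illustration (the Change-Id value below is just an example):

$ git log -1 --format=%B
Make atomic reviews first-class in the merge queue

Change-Id: I8473b95934b5732ac55d26311a706c9c2bde9940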

This shift is being powered by tools like GitButler and JJ, which are built on a core philosophy: Workflow Over Plumbing. Modifying commits, rebasing, and resolving conflicts remain intimidating hurdles for many developers. The Git command line can be complex and unintuitive. These new tools abstract that complexity away, guiding users through commit management in a way that feels natural. The result is faster iteration, higher confidence, and a far better developer experience.

AI and the Evolving Craft of Code Review

Finally, no technical summit in 2025 would be complete without a deep dive into AI. The arrival of AI-assisted coding is fundamentally shifting the dynamic between author and reviewer.

Engineers at the summit expressed a cautious optimism. On one hand, AI is a powerful tool to accelerate reviews, improve consistency, and bolster safety. On the other, everyone is aware of the trade-offs. Carelessly used, AI-generated code can weaken knowledge sharing, blur IP boundaries, and erode a team’s deep, institutional understanding of its own codebase.

The challenge going forward is not to replace the human in the loop, but to strengthen the craft of collaborative review by integrating AI as a true co-pilot.

A Path to 100x Scale: The GHS Initiative

The most forward-looking discussions at the summit centered on how to achieve the massive scale required. One of the most promising solutions presented was GHS (Git-at-High-Speed). This innovative approach is not just an incremental improvement; it’s a strategic initiative designed to increase SCM throughput by as much as 100x.

The project’s vision is to enable platforms like Gerrit, GitLab, and GitHub Enterprise to handle the explosive repository growth and build traffic generated by modern AI workflows. By re-architecting key components for hyper-scalability, GHS represents a concrete path forward, ensuring that the industry’s most critical SCMs can meet the unprecedented demands of the AI-driven future.

The Road from Paris

The Gerrit User Summit 2025 was more than a look back at the “glorious days.” It was a statement. The Git and Gerrit communities are unified, energized, and actively building the next generation of SCM. The spirit of GitTogether 2011 is back, but this time it’s armed with 14 years of experience and a clear-eyed view of the challenges and opportunities ahead.


Antonio Barone – Gerrit Maintainer, Release Manager
Luca Milanesio – Gerrit Maintainer, Release Manager, Gerrit Engineering Steering Committee
Jacek Centkowski – Gerrit Maintainer

2023: New Year and opportunities for GerritForge and Gerrit Code Review

TL;DR: GerritForge dedicated its efforts to organising and managing the Gerrit User Summit in London back in November 2022, in conjunction with the release of Gerrit v3.7. The event was a great success, with a significant presence on-site and record-breaking attendance on the GerritForge TV YouTube channel. We have also kept our promises to research and improve JGit and Gerrit scalability for large mono-repos with tens of millions of objects and refs. 2023 will see the finalisation of these efforts, with increased development effort, a new JGit committer pushing the platform to a new level of performance and scalability, and an innovative new system for collecting and optimising repository metrics automatically. Stay tuned.

Read the full story here below (9 mins read).


2022 has been a critical year for turning the Gerrit Code Review community and development back on track after the COVID-19 pandemic. At GerritForge, we’ve been working hard to make sure that the development, support, and innovation of Gerrit Code Review continue on its main objectives.

Gerrit Code Review v3.6 and v3.7

We have continued to deliver on the development and release of Gerrit Code Review and its plugins, helping the testing and releasing of versions v3.6.0 (May) and v3.7.0 (November).

Some numbers of the past 12 months’ development contributions by individual committers and companies:

  • 3,627 Changes have been merged on 76 projects related to the Gerrit Code Review platform, including JGit
  • 113 committers from 42 different organisations

A special mention to the top #10 contributors: Google (Ben Rohlfs, Edwin Kempin, Chris Pouchet, Dhruv Srivastava, Frank Borden, Milutin Kristofic), GerritForge (Luca Milanesio), Wikimedia (Paladox) and SAP (Matthias Sohn and Thomas Dräbing).

In comparison with 2021, we had 25% fewer changes merged, but with more contributors coming from more companies, which is a sign of a very healthy and thriving ecosystem of maintainers.

GerritForge committed to resuming the face-to-face user summits, which had been suspended since 2020.

The Gerrit User Summit 2022 took place in London, UK, on the 10th and 11th of November in a hybrid format, with people having the opportunity to participate either on-site or remotely via GerritForge’s YouTube TV channel.

It was a glorious success, with record-breaking attendance from all around the globe:

  • 50 people registered to attend on-site, 26 of them managed to arrive despite the London tube strike, whilst the others attended remotely
  • 235 people viewed the summit on YouTube with an average view time of 40 mins (one talk)

The summit survey had an outstanding report showing a huge acceptance and appreciation of the event:

  • 82% rated the remote video streaming as “good” or “outstanding”
  • 96% rated the quality of the summit as “good” or “outstanding.”
  • 100% would recommend the summit to a colleague, with 83% strongly recommending it

GerritHub.io SLA gets closer to five-nines.

We have been working hard to make Gerrit more stable and resilient throughout 2022, discovering and fixing many issues in the code base and on the multi-site software architecture.
In 2022, GerritHub.io had only six small hiccups for a total of 19 mins of downtime (SLA = 99.997%) over a 12-month period, a 75% reliability improvement compared to 2021.

We have run extensive RCAs on the causes of the downtime and identified two leading issues, which are explained in detail below.

The “anonymous unlimited query” hole in Gerrit
GerritHub.io suffered a 15-minute outage because anonymous users were able to bring all the sites offline before the system could auto-recover.
Gerrit allowed all query limits set in the ACLs to be bypassed by simply adding the “no-limit” parameter.
Returning an arbitrary payload without limits allows a single user to generate a server-side workload collecting and building a JSON payload gigabytes in size; unfortunately, that option was available to everyone, including anonymous users, making any publicly facing Gerrit Code Review installation subject to denial-of-service attacks.
We identified, reported, and fixed the issue in Gerrit with Change 333304, which has been included in Gerrit v3.3.10, v3.4.4, v3.5.1, and all v3.6.0 or later releases.
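
To make the exposure concrete, the request below is the kind of query that used to be accepted even from unauthenticated clients before the fix (“no-limit” is a standard Gerrit query option; the host name is illustrative):

$ curl 'https://gerrit.example.com/changes/?q=status:open&no-limit'

A query like this on a large site forces the server to serialize every matching change into a single JSON response.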

More granular monitoring and alerting
We have lowered the threshold of uptime checks on GerritHub.io to 1 minute, giving us the ability to detect and react immediately to 4 smaller hiccups. We detected a lack of scalability for some specific higher-load projects; those hiccups were responsible for 2 minutes of downtime over the second half of 2022. Many more projects are also planning to be onboarded onto GerritHub.io; hence, we need to address these project-specific capacity needs.

Scaling Gerrit Code Review and JGit beyond its limits

We have been investing massive effort in building a test environment designed to stress Gerrit and JGit to their limits and identify all the limitations and bottlenecks that prevented us from scaling further.

Scaling the test repository
Over the months, we created test repositories that grew in every dimension (a quick way to spot-check these dimensions is sketched after this list):

  • Tens of millions of refs as both refs/changes and refs/heads
  • Millions of delta-chains
  • Tens of millions of Git objects
  • Packfiles of tens of gigabytes and packed-refs files of hundreds of megabytes
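
The same dimensions can be measured on any repository with stock Git commands (illustrative; the paths assume a non-bare clone):

$ git count-objects -v                        # loose and packed object counts, total pack size
$ git for-each-ref 'refs/changes/*' | wc -l   # number of change refs
$ du -sh .git/objects/pack                    # packfile footprint on disk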

To generate significant load on both the client and server side, we invested further in the aws-gerrit cloud setups and the gatling-git performance loading tool.

There were some “well-known” issues and additional surprising ones.

SHA1 complexity and CPU utilization for large entities
JGit has been using SHA1 to ensure uniqueness not just of Git objects but also of other large entities. However, computing SHA1 has become increasingly CPU-intensive because of the relatively recent findings about collisions on shattered.io.
In cooperation with Matthias Sohn (SAP), we highlighted two major potential improvements: raw SHA1 performance and its use in detecting packed-refs changes on the filesystem.

Commit priority queues
JGit has a custom priority-queue implementation, used intensively in RevWalk, with almost quadratic complexity. That isn’t a problem for small to medium chains of commits; however, when the number of commits reaches millions, the performance degradation becomes unbearable.
We have replaced JGit’s custom implementation with the one provided by the Java standard library, whose logarithmic complexity massively improves performance with large commit chains.

Unwanted reachability checks
JGit needs to perform a full reachability check whenever an unknown remote client advertises refs, which makes sense when serving an untrusted remote client. However, a full reachability check over millions of advertised refs is a daunting task, one that can be alleviated if the remote end can be considered trusted.

Fixing JGit bitmaps
Since the introduction of Git bitmaps, the whole community has learned how key they are in speeding up counting and selection during the clone phase.
However, large and unoptimized bitmaps can be so unhelpful that, instead of speeding Git up, they represent a massive overhead for the system, causing CPU spikes and, eventually, lowering the throughput of the server.
Git bitmaps are compressed using the JavaEWAH library, which is good for memory consumption but bad for CPU utilization: that is why smaller is better for performance.
We have discovered and fixed a critical issue in the JGit bitmap generation that was causing the inclusion of all commits and BLOBs pointed to by annotated tags. We have also introduced the ability to inform JGit about the heads that can be excluded from the bitmap, shortening the creation time dramatically (from 5 hours down to as little as 60 seconds for a repository with 2k heads) and increasing its effectiveness by 200%.
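
For context, this is how a bitmap is produced with stock C Git during a repack; the head-exclusion knob described above is a JGit-specific option and is not shown here (illustrative):

$ git repack -a -d --write-bitmap-index
$ ls .git/objects/pack/*.bitmap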

Millions of unneeded ref logs
When performing a clone of a repository with millions of heads, JGit created one local reflog file for every remote ref, including the ones that were not actually checked out but just fetched as remote references. This created a significant performance gap between JGit and Git, which instead lazily creates reflog files only when refs are checked out for the first time. Cloning a single branch of a repository with millions of remote refs took around 1 hour with JGit, compared to a few minutes with Git.

All of the findings were included in multiple updates on the following components:

  • JGit changes: all fixes were also provided to stable-5.13, the last supported branch for Java 8, which allows older versions of Gerrit, from v2.16 onwards, to benefit from these improvements.
  • pull-replication went through major performance improvements, achieving execution times up to 1000x faster than the traditional replication plugin
  • aws-gerrit is being upgraded to make use of the pull-replication plugin, including support for the bearer token, which allows replicating virtually any repository, including All-Users.git
  • gatling-git: we have upgraded the Gatling version and JGit to the latest stable-5.13 to include the latest performance improvements.
  • git-repo-metrics: we have introduced a brand-new plugin that lets us keep the major dimensions of a repository under control and graph their growth over time.

GerritForge goals for 2023

We are definitely not done yet with the performance improvements on Gerrit and JGit: there are still significant improvements to be made and JGit changes to be merged into the mainstream branches.
We believe we are on track to finish the job and deliver a stable, scalable platform for large Git repositories in 2023.

Finalise what we cooked in 2022 for JGit
JGit has a new maintainer, David Ostrovsky, who was made a committer on the project in 2022. GerritForge’s developers are focused on getting more reviews and attention on the JGit performance improvements. We are committed to finalising all the open changes related to large repositories.

JGit multi-pack indexes support
There is still a major gap between JGit and Git when dealing with very active repositories: multi-pack indexes. The proliferation of packfiles eventually leads to a long and painful search-for-reuse phase for BLOBs, which could be cut down hundreds of times with a multi-pack index.
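
C Git already ships this feature, which is exactly the gap JGit needs to close; a minimal sketch of what it looks like on the command line:

$ git multi-pack-index write     # builds objects/pack/multi-pack-index across all packfiles
$ git multi-pack-index verify    # checks the index against the packfiles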

Git repository optimiser for Gerrit
We have been tracking live information about Git repositories thanks to the git-repo-metrics plugin. Wouldn’t it be nice to have a tool that can act on that data automatically?
We will be doing R&D on how to correlate the repository metrics, the Git audit trail, and the performance data to make AI-based decisions on what needs to be improved in the repository.
This work stream is going to be useful for any Git repository, not just the ones powered by Gerrit Code Review: git-repo-metrics and the repository optimiser would also apply to other products, including GitHub and GitLab.

Gerrit v3.8 and projects-specific change numbers
We will finalise the design document for the transition to project-specific change numbers in Gerrit v3.8. That would allow the seamless migration of projects across Gerrit setups without having to worry about change renumbering anymore.

Gerrit Code Review testing and GerritForge-certified binaries
GerritForge is spending a tremendous amount of time developing test environments and tools for serving the Gerrit community with more stable releases and improving the quality of its code. We want to intensify the effort and also offer our platinum support customers a unique service that includes the GerritForge digital signature and rubber stamp on the binaries of Gerrit Code Review and its plugins that have been successfully tested and validated for being production-ready.
Stay tuned; more details are coming soon …

GerritForge company forecast in 2023

GerritForge Inc. will finalise its roll-out to the USA, and all contracts and services will be run from Sunnyvale, CA and Europe. Over 2022, 60% of the customers and businesses have already been moved, and the operation will be completed over the course of 2023.

We are looking forward to doubling our revenue figures in 2023 and also our contributions to the open-source community, with a main focus on JGit as the driver of performance growth for Gerrit Code Review.


2023 is going to be an incredible year for GerritForge, Gerrit Code Review, and the JGit community altogether.

Happy New Year 2023!

Luca Milanesio (GerritForge)
Gerrit Code Review Maintainer and Release Manager
Member of the Gerrit Engineering Steering Committee

New year, free GerritHub: unlimited private reviews with anyone, forever

Today GitHub announced the extension of its free plan to include unlimited private repositories. This is great because it allows a lot more people to start experimenting with their side projects and keep them confidential until they are ready to be shared publicly.

GerritHub.io extends this amazing offer by adding a fully-featured code review process on top of your private GitHub repositories while keeping the confidentiality needed for early-stage projects. Unlike GitHub, however, GerritHub allows you to have an unlimited number of reviewers and collaborators, for free, forever.

A wonderful new 2019 is starting with two amazing free offers to allow everyone to experiment and unleash their potential:

  • Free unlimited repos from GitHub, limited to 3 collaborators
  • Free unlimited repos from GerritHub, with unlimited collaborators for reviews

That’s super-cool, how do I start?

Getting started with your private GitHub repositories on GerritHub is easy:

  1. Go to https://review.gerrithub.io
  2. Click the top-right “Sign-in” link
  3. Select “Private” option and click the top-right “Login” button
  4. Enter your GitHub credentials
  5. Allow GerritHub read/write access to your private repositories
  6. Select the GitHub SSH keys and profile data to import into Gerrit, and click the top-right “Next” button
  7. Select the organization and repositories to import into GerritHub, and click the top-right “Import” button
  8. Select the GitHub PRs you want to import into GerritHub for review, and click the top-right “Import Selected” button

Once you’re done with the above steps, you’re up-and-running with GerritHub and you are free to invite collaborators and accept reviews.

You can follow the GerritHub video on YouTube which describes the above process.

I am new to Gerrit Code Review, where do I start?

There is plenty of information on the web about Gerrit Code Review. The best place to start is the project’s tutorial in the documentation.

Alternatively, you can watch the presentation by Shawn Pearce, the Gerrit Code Review project’s founder.

 

Have questions? Get in touch with the Community.

In case of issues or questions, you can get in touch with the Gerrit Code Review community, which will be happy to guide you through and provide support.

Want to use Gerrit in your Enterprise?

If you decide to use Gerrit Code Review in your enterprise and need a service level compliant with your company standards, you can get in touch with GerritForge, which offers the full coverage of Enterprise Support you will need:

  • Silver: 8×5 Support, with 24h turnaround for P1 issues
  • Gold: 24×7 Support, with 8h turnaround for P1 issues
  • Platinum: 24x7x365 Support, with 4h turnaround for P1 issues

What’s next?

With GitHub and GerritHub you have no excuses anymore to start innovating right now, with free unlimited repositories and free unlimited Gerrit reviewers and contributors.

Go and innovate, the future is now. 

GitHub acquired by Microsoft: what’s next?

The world woke up this morning with shocking and exciting news at the same time: GitHub is going to be a Microsoft Business.
There are mixed feelings, and GitLab has already reported a tremendous increase in the rate of projects imported from GitHub and a record number of new account registrations, all tagged with the #MoveToGitLab Twitter hashtag.

Do not press the panic button

Microsoft had, unfortunately, a historical record of acquisitions that did not go very well. However, that doesn’t mean that GitHub is going to follow the same path.

The question is: what is going to change in the next few weeks? Possibly nothing at all. It is not the time to panic and look frantically for quick alternatives without really thinking them through. GitHub is there, it works, and it is not going to change in the near term.

Looking for more independence and Openness

One thing people should do right now is stay with GitHub and keep their presence as it is today. At the same time, it is clear that the economics of the staggering $7.5Bn price tag will start to impact future decisions and the direction of its services, but nobody knows when or how.

If you are looking for something better, more open, and more powerful, you should look at what the best of the open-source community offers with Git and Gerrit Code Review.

OpenSource Code Review, 10 years of independence

Gerrit Code Review was founded on the 1st of October 2008 by Google and has, since then, been a paragon of openness and vendor neutrality. There are NO “Community” vs. “Enterprise” editions, no vendor lock-in, no pull-request filtering for enterprise-class features.
According to the official Gerrit Analytics page (http://gerrit-analytics.gerritforge.com), over 160 organizations have contributed a stunning 36k commits to Gerrit, and the project keeps growing.

Gerrit Code Review project contributions since its inception over 10 years


Try Gerrit Code Review workflow and stay on GitHub

Since 2013, a new service called GerritHub allows OpenSource projects and private companies to leverage Gerrit Code Review workflow and keep their public presence on GitHub.
In addition to a much more powerful and functional workflow, they get for free the ability to be discoverable on GitHub and accept contributions as Pull Requests.

What if I want to leave GitHub anyway?

Should you decide to stay on Gerrit Code Review and leave GitHub in the future, you will always have your repos and reviews on Gerrit, and you can cancel your GitHub subscription at any time, without any consequence for your community.

So, why not give Gerrit Code Review a try?
https://review.gerrithub.io/static/intro.html

 

 

 

Gerrit Summit 2016 is coming


Four weeks from now, the eighth edition of the Gerrit User Summit will open its doors at Google HQ in Mountain View, CA, on the 12th-13th of November 2016.
It has been a long journey since the first GitTogether in 2008, and after the split between the Git[Hub Universe] summit and the traditional “unconference” style Gerrit event at Google’s, things have changed quite a lot. While Gerrit has remained a 100% open-source, user-centric project, GitHub has attracted $350M in VC funding and has drifted away from the unconference-style events over the years.

What’s new this year?

For the first time, the proposals of talks to the Gerrit User Summit are submitted in Gerrit directly (yeah!) on the summit/2016 repository.

The list of currently approved talks is available by searching for “status:merged project:summit/2016” (https://gerrit-review.googlesource.com/#/q/status:merged+project:summit/2016)
The talks awaiting review are under “status:open project:summit2016” (https://gerrit-review.googlesource.com/#/q/status:open+project:summit/2016)

How cool is that? I foresee already a Doodle plugin for Gerrit 😉

How to register for the User Summit?

Shawn Pearce has prepared a Registration Form for you to sign-up to the event:
https://goo.gl/forms/oeEnQweHl2noNSnn1

Once you access the Registration Form at the above URL, you need to sign-in with your Google Account credentials and then complete the following information:
– Your name
– Your Organisation
– Your previous attendance to the user summit
– Any dietary restrictions

The User Summit is FREE for EVERYONE, including novice users of Git and Gerrit Code Review, but you would need to register beforehand.
The Summit is a unique opportunity to learn about new Gerrit features, contribute to the product roadmap with your needs and requirements and, most of all, network with other users to learn about new use-cases where Gerrit can be very helpful.

How to submit my talk proposal?

Well, you need to demonstrate a good understanding and use of Gerrit Code Review if you want to teach and talk to other people about it! At the end of the day, if you want to talk about Gerrit you should be able to clone a repository and submit a patch to a project 🙂

If you need just a little help … see my “Diffy super super talk” example:

$ git clone https://gerrit.googlesource.com/summit/2016 && (cd 2016 && curl -Lo `git rev-parse --git-dir`/hooks/commit-msg https://gerrit-review.googlesource.com/tools/hooks/commit-msg ; chmod +x `git rev-parse --git-dir`/hooks/commit-msg)
$ cd 2016
$ cat - > sessions/my-amazing-talk.md
# My amazing talk at Gerrit User Summit

Hi folks, this is my super-duper-talk. You should be interested in it as I will unleash the dark force of Code Review Diffy Kung Fu Review Cuckoo.

*Diffy, Birds & CO. Inc.*
^D
$ git add sessions/my-amazing-talk.md && git commit -m "Diffy super-duper talk"
$ git push origin HEAD:refs/for/master

Talks highlights.

There are already some fascinating talks submitted and approved and more will undoubtedly come in the next couple of weeks. We will start sharing some highlights of what’s happening at the conference. Here is the overview of the first talks.

What’s new in Gerrit 2.12 and 2.13

Two major versions of Gerrit have been released since the last summit in 2015, and they contain significant improvements to the platform:

  • Topic submission workflow – aka Git commits across repositories  (v2.12).
    Group multiple changes in a “topic” and have them merged as a whole, even across multiple repositories, in a single submit operation.
  • GPG signed push verification (v2.12).
    Allows people to upload their GPG public keys into Gerrit and have them used to verify Git signed commits.
  • Large File Storage support (Git LFS) (v2.13).
    Gerrit finally supports the automatic management of large files outside the Git repository. The feature is fully pluggable and exposed via plugins. Amazon S3 and Local file system support are available at the moment, but more plugins are here to come on this feature.
  • Gerrit metrics (v2.13).
    Expose the internal metrics to external consumers. The feature allows plugins to gather this data and send it to external systems for analysis and visualization purposes. Graphite, ElasticSearch, and JMX plugins are available.
  • Hooks plugin (v2.13).
    Finally, the Gerrit hooks mechanism has been entirely externalized and implemented in a pluggable way. The legacy hooks have become a core plugin, and you can now develop a new generation of hooks by leveraging the new extension points provided.
  • New HTML5 UX with WebComponents – PolyGerrit preview (v2.13).
    The next generation of the Gerrit UX, based on Polymer web components, is available. Even though it is not complete, it offers a sneak preview of what the new interface looks like and, if you like it as-is and it is good enough for your use-cases, you can enable it and start using it already. Both the GWT and Polymer-based UX use the same REST API, and thus the changes generated and reviewed with them are 100% interoperable.

There is more to come.

In the next few days we will keep on publishing the highlights of the topics coming at the Gerrit User Summit this year, stay tuned and REGISTER NOW at:
https://goo.gl/forms/oeEnQweHl2noNSnn1

The GerritForge Team.

How to Migrate a Git Repository

When and why?

We wrote yesterday about the GitEnt-Scm.com shutdown due on April 30th, 2016. The issue you now face is: how do you migrate somewhere else?
Although StackOverflow already contains over 800 threads answering this question, we thought that a practical example based on a real-life GitEnt repository would help you avoid trial-and-error discovery.

Step 1 – Mirror clone

When you want to clone a repository for the purpose of migration, you really want everything, including all the other refs that are not branches:

  • Git Tags (refs/tags/*)
  • Git Notes (refs/notes/*)
  • Gerrit Reviews (refs/changes/*)
  • Gerrit Configs (refs/meta/*)

Instead of using a standard clone, you can do a “git clone --mirror”, which implies --bare and thus does not generate a working copy.

Example:

$ git clone --mirror ssh://myuser@gitent-scm.com/git/myorg/myrepo.git
Cloning into bare repository 'myrepo.git'...
remote: Counting objects: 109, done
remote: Finding sources: 100% (109/109)
remote: Total 109 (delta 19), reused 83 (delta 19)
Receiving objects: 100% (109/109), 66.42 KiB | 0 bytes/s, done.
Resolving deltas: 100% (19/19), done.
Checking connectivity... done.

Step 2 – Create empty repo on the new Git Server

You need an empty target repository to push your mirrored local clone to. Note that most Git servers offer to create an initial master branch with a README, but in this case you do not need it, and it would only create more trouble in your migration path.

Example for GitHub:

– Go to https://github.com/new and create the ‘myrepo’ repository
– Do not tick any of the suggested README or LICENSE auto-generation
– Once the project is created, GitHub provides you with the repository Git URL (e.g. git@github.com:myorg/myrepo.git)

Step 3 – Push to the new Git Server

You are now ready to push to the target repository, and we can use the useful “--mirror” option again.
Similarly to the clone, “--mirror” automatically includes all refs, including the non-branch ones (tags, notes, reviews, configs, …); it also removes any remote refs that are not present in your local clone. You should never use this option with a regular default clone, as you would risk removing all the remote refs that are not normally fetched by a standard “git clone” operation.

Example for GitHub:

$ git push --mirror git@github.com:myorg/myrepo.git
Counting objects: 109, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (61/61), done.
Writing objects: 100% (109/109), 66.42 KiB | 0 bytes/s, done.
Total 109 (delta 19), reused 109 (delta 19)
To git@github.com:myorg/myrepo.git
* [new branch] refs/changes/02/802/1 -> refs/changes/02/802/1
* [new branch] refs/changes/03/803/1 -> refs/changes/03/803/1
* [new branch] master -> master
* [new branch] refs/meta/config -> refs/meta/config

Step 4 – Import into GerritHub.io (Optional)

Your repository has now been fully migrated to your new target server. If you wish to keep using Gerrit Code Review for your development workflow, you can link your repository to Gerrit using GerritHub.io.

The YouTube Video explains how to perform this last operation using GerritHub.io import Wizard.

Need more help?

Do you require more help? Contact our Sales Department at sales@gerritforge.com and we will provide the extra support you need or perform the migration to GerritHub.io for you.

GerritHub user-controlled GitHub Scopes

Nowadays people are very careful about privacy and user data: nobody grants access to their profile without first checking the possible consequences.
We want to give users the ability to always know and control what level of access is given to their data: that’s why we have improved the way you log in to GerritHub.io.

GitHub scopes: what are they?

GitHub provides authentication and access to a user’s profile using a protocol called OAuth 2.0. When GerritHub asks a user to authenticate, it is granted a set of permissions to operate on the user’s GitHub resources on their behalf, which include:

  • User’s personal data (name, e-mails)
  • User’s membership to organisations and teams
  • User’s repositories

The set of permissions to access and operate on your data is also known as “Scope” in GitHub terms.
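
For illustration, these are some of the GitHub-defined scope names involved; the exact set GerritHub requests depends on the access level you choose:

user:email     # read the e-mail addresses on your GitHub profile
public_repo    # read/write access limited to public repositories
repo           # full access, including private repositories
read:org       # read organisation and team membership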

How does GerritHub help me control my access?

As of today, GerritHub has a new “Scope Selection” screen with two main objectives:

  1. Displaying your current scope and associated rights GerritHub has on your GitHub profile
  2. Giving you the ability to switch to a different “Scope”, and consequently change the rights that GerritHub has on your profile data


Transparency is good, but what is the practical added value?

In the past, there have been common complaints about GerritHub having either too much or too little access to your GitHub profile:

  • Too much? Why does GerritHub.io need access to my e-mail address? Why does GerritHub need to see my public keys?
  • Too little? Why does GerritHub not show my private repositories in the import screen? How can I see my organisation membership in the GerritHub project security screen?

With the ability now to visualise and change the current “Scope”, people can be more aware of why things are not showing up. They can make conscious decisions about how to change them with full transparency on the associated implications.

A common scenario: importing and accessing private GitHub Organisations, Teams and Repositories.

When you need to import an existing private GitHub project, you need to access information that is not publicly available:

  • Your membership to a private organisation
  • Your ownership of a Team structure
  • Ability to clone and push your private organisations’ repositories

There is now a special information box suggesting that you have the ability to change your “Scope” if you don’t see the organisations and repositories you want to import.


After changing the scope, you can then log in again and you will have an improved set of options to get more data and repositories from your GitHub account.

Like it? Will you use it on a daily basis?

We are eager to get your feedback on this new feature: Tell us what you think and let us know what you would change or add to the set of “Scope” permissions.

GitHub outage, again :-( What is the real cost of FREE services?


As a bitter surprise today, we are experiencing another GitHub outage. This time it seems a more serious problem than the average DDoS: GitHub’s Ops Team is performing emergency maintenance on the whole site to recover the situation.

How much does a FREE GitHub service outage really cost me?

Everyone loves GitHub because it is nice, easy and most of all … it’s FREE ! Lots of projects started using it for much more than pure source code versioning:

  • People write books and documentation with it (see gitbook.com)
  • Teams started using it as a free artifact repository manager: projects won’t build at all when GitHub is down
  • Companies started hosting web-pages on GitHub (see the nicely rendered microsoft.github.io)
  • GitHub issue tracking and wikis are so simple that people use them for project collaboration

When everything works, it is amazing how productive your team can be using GitHub on a daily basis. But when it fails, what can you do? And what if my team cannot progress because they can’t see the tasks, wikis, requirement documents, web pages… how much money am I really wasting while people hang around for hours?

Let’s consider a small Agile team composed of 1 BA, 8 Agile devs, 1 Scrum Master, 2 DevOps, and 2 QA: a 30-minute outage like the one today has an impact on those 14 people of roughly 1 man/day, which means (for the US market) roughly $1,000 (an optimistic guess; it may cost even more). Even if GitHub goes down only twice a year (gosh, this has happened more than twice, I am afraid), your start-up will end up paying around $2,000/year for GitHub. The overall amount doesn’t sound that expensive… but you wonder why GitHub “was supposed to be really FREE” if you end up spending money on it.

If we apply the same figures to a medium-sized company with at least 160 people working on development, the overall figure jumps to around $20,000/year. More importantly, the time lost and the delays caused to the project schedule may have an avalanche effect on other teams, causing additional pain and costs across your organisation and programme plan. Those extra costs can sometimes be difficult to quantify, but they are certainly far more significant to your overall business.

Shall we give up using GitHub then? Or shall we move to GitHub:Enterprise instead?

The typical reaction to a GitHub outage is: “we cannot rely on the FREE version, we should buy GitHub:Enterprise, which will run inside our company network”, and to use this argument with your manager to get a purchase order finalised NOW (I may be too malicious… but an outage may actually generate more money for GitHub than loss of reputation). When you look at GitHub:Enterprise pricing, it turns out that for your 160 people you would need to spend $36,000/year, which is on the same order of magnitude as the $20,000 already wasted, without even considering the extra hidden costs of project delays.

But are you really solving the problem? GitHub and GitHub:Enterprise are the same product, same code-base, just different pricing. What makes you think that your internal Ops team can do a better job than GitHub? What makes you think that a GitHub bug would not appear on your GitHub:Enterprise setup? Are you just an optimistic person?

Moving to GitHub:Enterprise is typically needed when you have compliance or security requirements on data at rest, but it does not really address the problem of reliability and would potentially expose your team to even more outages for software upgrades and management that you typically don’t have when using GitHub alone. You would then be spending $36,000 on top of the $20,000 (or even more) wasted previously, without any real benefit.

Learning how to fly with GitHub

How to solve the problem then? Can we learn from somebody’s else experience?

Airplanes have exactly the same (if not even more demanding) requirements for their engines as we have for a version control system. For an aircraft, cruising speed is everything: without the speed provided by its engines, it cannot fly. We have similar requirements in our development team, where GitHub is really what we need to progress our development; otherwise, we are blocked.

The solution for making an airplane reliable is not buying more expensive engines (which are not necessarily more reliable) but using two engines instead of one. Can we apply the same to GitHub? GitHub is, in a nutshell, a Git server: why not rely on redundancy and replication? Can I set up a replica of GitHub and use it for my reviews?

You can of course build your own replica using plain Git and GitHub WebHooks: it would require a bit of scripting but it can be done. During an outage you can use the replica and when GitHub is back all the pending changes can be pushed back to GitHub.
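
A minimal do-it-yourself sketch (host names and paths are examples): keep a bare mirror in sync from a cron job or a webhook handler, point developers at it during an outage, and push any pending work back once GitHub recovers.

$ git clone --mirror git@github.com:myorg/myrepo.git /srv/git/myrepo.git
$ git -C /srv/git/myrepo.git remote update --prune    # run periodically or from a webhook
$ git -C /srv/git/myrepo.git push --mirror origin     # after the outage, push the accumulated refs back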

Can I have another FREE and automated replica of GitHub?

This is becoming challenging now: we want something that is completely FREE (no time spent writing scripts or webhooks, no service provider to pay, no commercial product) but that allows us to use GitHub with replication, including code reviews.

It may seem strange, but what we are looking for actually exists: an open-source project called Gerrit Code Review. It is not only a code review and Git server like GitHub, but it also offers more advanced security and replication capabilities. It has been designed with the needs of large distributed teams in mind, making their daily development lifecycle more reliable, independently of local failures.

Cool, how can I get started with Gerrit and GitHub now with no hassles?

You can read this quick introduction to get started setting up your private replica or, if you are really in a hurry and want a FREE hosted service, you can sign up to GerritHub.io with 3 clicks.

I have only 5 mins of free time today: what can I read/watch to understand how it works?

Well, there are plenty of resources but if you are really in a hurry, you can watch the following YouTube Video:

If you have more time, you can read the Gerrit Code Review overview and tutorial at: https://review.gerrithub.io/Documentation/intro-quick.html

Get ready now, so you don’t waste money again when the next GitHub outage… that nobody wishes for… (sadly) happens 😦

Pingdom status for GerritHub.io


You can check the status of GerritHub.io services in real time thanks to the public page offered by PingDom.com

The GerritHub.io status page is http://status.gerrithub.io and displays:

  • Current status with response time
  • History of the past 7 days with uptime
  • Details of the last 24 hours

Tonight, for instance, it reported a temporary service outage (3 times, around 15 minutes each) caused by an intermittent unavailability of the GitHub API. As GitHub was not able to validate its OAuth code credentials, GerritHub could not complete its login handshake, resulting in a partial outage.

We will use the PingDom.com reports to reinforce our production infrastructure and make GerritHub.io more resilient in the future. For instance in this case (GitHub API unavailable) we will look at reusing cached credentials for allowing known people with non-expired Gerrit cookies to complete their operations. For unknown users, we will display next time a courtesy message explaining that the sign-up is unavailable for GitHub API temporary outage, avoiding the allocations and time-outs of HTTP connections.