[Illustration: a cartoon person and a robot holding hands, each with multiple speech bubbles]

Establishing trust in using GitHub Copilot

Ryan Salva // VP of Product // GitHub

Integrating emerging technologies into your business is bound to raise some important questions. Below, we answer the questions customers most commonly raise when evaluating GitHub Copilot for their business.

In this guide you will learn:

  • How GitHub implements technical and contractual safeguards in GitHub Copilot 

  • How to navigate the contractual terms of your GitHub Copilot purchase, whether you purchased directly or through Microsoft

The best way to establish trust in GitHub Copilot is to review our Trust Center, which explains the measures GitHub has implemented in the product.

What technical safeguards are in place?

GitHub has created a duplication detection filter to detect and suppress GitHub Copilot suggestions that contain code snippets that match public code on GitHub. Your enterprise administrator can choose to enable this filter for all organizations within your enterprise, or they can defer control to individual organizations. 

With the filter set to “Block”, GitHub Copilot compares each suggestion’s lexemes against the lexemes indexed from public code repositories on GitHub. If a suggestion contains 65 or more consecutive lexemes (roughly 150 characters) that match public code on GitHub, the suggestion is not shown to the user. Read more about GitHub Copilot’s data pipeline in the next guide in this module.
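To make the matching idea concrete, here is a minimal sketch of how an n-lexeme duplication filter can work. This is an illustration only, not GitHub’s actual implementation: the real filter uses language-aware lexing and a large-scale index, while this sketch approximates lexemes with a simple regular expression.

```python
# Illustrative sketch of a duplication filter: block any suggestion that
# shares a run of MATCH_THRESHOLD or more lexemes with indexed public code.
# (Hypothetical simplification; not GitHub's production implementation.)
import re

MATCH_THRESHOLD = 65  # suggestions with a 65-lexeme match are suppressed

def lexemes(code: str) -> list[str]:
    """Very rough stand-in for a real lexer: identifiers, numbers, symbols."""
    return re.findall(r"[A-Za-z_]\w*|\d+|\S", code)

def build_index(public_snippets: list[str], n: int = MATCH_THRESHOLD) -> set:
    """Index every n-lexeme window of the public corpus."""
    index = set()
    for snippet in public_snippets:
        toks = lexemes(snippet)
        for i in range(len(toks) - n + 1):
            index.add(tuple(toks[i:i + n]))
    return index

def should_block(suggestion: str, index: set, n: int = MATCH_THRESHOLD) -> bool:
    """Block the suggestion if any n-lexeme window matches the index."""
    toks = lexemes(suggestion)
    return any(tuple(toks[i:i + n]) in index
               for i in range(len(toks) - n + 1))
```

Note that short suggestions (fewer than 65 lexemes) can never trigger the filter in this scheme, which is why common idioms and boilerplate are not suppressed.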

While our experiments have shown that GitHub Copilot suggests code of the same or better quality than the average developer, we can’t provide any assurance that the code is bug free. Like any programmer, GitHub Copilot may sometimes suggest insecure code. We recommend taking the same precautions you take with code written by your own engineers: linting, code scanning, IP scanning, pull request reviews, and so on.
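As one way to put these precautions into practice, a repository can run code scanning automatically on every pull request. The following GitHub Actions workflow is a minimal sketch using CodeQL; the file would live at `.github/workflows/codeql.yml`, and the `languages` value shown here is an assumption you would adjust to your codebase.

```yaml
# Minimal CodeQL code-scanning workflow (illustrative configuration)
name: "Code scanning"

on:
  pull_request:

jobs:
  analyze:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      security-events: write
    steps:
      - uses: actions/checkout@v4
      # Initialize CodeQL for the languages in your repository
      - uses: github/codeql-action/init@v3
        with:
          languages: python
      # Run the analysis and upload results to the Security tab
      - uses: github/codeql-action/analyze@v3
```

Pairing a gate like this with required pull request reviews means AI-suggested code passes through the same quality checks as any other contribution.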

To us, GitHub Copilot represents the quickest and safest path toward fully leveraging AI to empower our developers. Not only are its current suggestions better than other offerings, its product roadmap aligns with where we think the technology needs to be headed.

Alexander Hanl // PM for CI/CD environment // CARIAD

Contractual protection

Both GitHub and Microsoft provide indemnity for their Copilot products. This means that when you use GitHub Copilot, you’re backed by a contractual commitment: if any suggestion made by GitHub Copilot is challenged as infringing third-party intellectual property (IP) rights and you have enabled the duplication detection filter, our contractual terms are designed to shield you. This indemnification, detailed in agreements such as the Microsoft Business & Services Agreement (MBSA) and the individual Microsoft Product Terms, extends explicitly to the outputs of Microsoft’s and GitHub’s generative AI services, offering a layer of security that complements GitHub Copilot’s technical measures.

While the baseline risk of infringement with GitHub Copilot is already low due to the nature of the technology and its legal standing, our multi-layered strategy further reduces this risk. This framework is based on our confidence in GitHub Copilot’s compliance with IP laws, and is a reflection of our commitment to protecting our customers as you navigate the exciting landscape of AI-powered development.

Who owns GitHub Copilot outputs?

GitHub does not claim ownership of a suggestion produced by GitHub Copilot. 

Which legal terms and contracts apply to my business?

GitHub Copilot Business and GitHub Copilot Enterprise customers are entitled to IP indemnification from GitHub for GitHub Copilot’s unmodified suggestions when the duplication detection filter is set to “Block”. In other words, if you enable this feature, we will defend you in court against resulting copyright claims.

As you integrate GitHub Copilot into your organization, you will navigate a set of contractual terms that ensure both compliance and clarity in usage. You will need to review the respective terms based on your purchasing channel (through Microsoft or directly through GitHub). Understanding these will help you align GitHub Copilot’s use with your legal and operational requirements. Below, we provide a brief overview that will guide you toward identifying the legal terms most relevant to your situation.

Your contract stack (the various legal agreements covering each party’s legal obligations) depends on your purchasing route. Each stack has its precedence order in case of conflicting terms:

  • Purchasing through GitHub: The GitHub Customer Agreement (GCA) and the Product Specific Terms for GitHub Copilot cover critical elements like code ownership and data usage. The terms in these documents are part of your contract stack the moment you start using GitHub Copilot. Familiarize yourself with them at GitHub's Customer Terms page.

  • Purchasing through Microsoft: This route introduces documents like the Microsoft Business & Services Agreement (MBSA), as well as the Microsoft Product Terms. The Product Terms include terms that apply to all Microsoft services (the “Universal License Terms”) and product terms specifically for GitHub (the “Microsoft Product Terms for GitHub Offerings”). The contracts from Microsoft outline the scope of your legal coverage and obligations when using GitHub Copilot.

We know that understanding the legal terms that apply to your purchase is vital to ensuring a compliant and effective use of GitHub Copilot, aligning its AI capabilities with your enterprise’s legal frameworks. For more information on the various documents that may apply to your purchase, make sure to check out the GitHub Copilot Trust Center.

Up next: How GitHub Copilot handles data

Now that we’ve learned about safeguards implemented in GitHub Copilot, let’s take a look at what data GitHub Copilot collects and how that data is used, transferred, stored, and where its lifecycle ends.

