
GitHub Copilot Radar

GitHub Copilot Radar supports code navigation with a constantly updated view of the other code locations most relevant to the one you’re currently viewing or editing. Think of it as jump-to-definition on steroids: it leverages both static analysis and GitHub Copilot’s neural reasoning to make sure developers get the background they need about the code they touch.

What's it for? Giving developers the context they need.
Stage: WIP

We’re exploring ways to provide developers with a 360-degree view of the most relevant aspects of the code they’re currently editing.

When you’re editing a real system, your current location usually connects densely to many other parts of the codebase, and it’s hard to tell which of them holds the crucial information you need right now. Scrolling, scanning, and tools like jump-to-definition take time, disturb your flow, and still often miss crucial details.

But if you’re working with GitHub Copilot as an AI pair programmer, it already regularly scans the code and assesses what’s relevant — it must, to make targeted suggestions. We leverage its insights to create a continuously updating "radar" view of the snippets most relevant to your current location.

How does it work?

From a technical perspective, this is based on GitHub Copilot’s attention mechanism: At each position in the code, GitHub Copilot builds up a separate state that encodes its understanding of the code at that position.

The states are built up iteratively: each position in the code continues to query previous states at other positions for interim results. Tracing the pattern of those queries shows us what GitHub Copilot considers the most relevant parts of the code.
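To make the idea concrete, here is a minimal sketch of how per-token attention weights from a causal language model could be aggregated into line-level relevance scores. This is not Copilot's actual pipeline; the `attention` tensor, `token_to_line` mapping, and `cursor_token` index are illustrative assumptions.

```python
import numpy as np

def rank_relevant_lines(attention: np.ndarray,
                        token_to_line: list[int],
                        cursor_token: int,
                        top_k: int = 5) -> list[int]:
    """Rank source lines by how much attention the cursor token pays to them.

    attention: [num_heads, seq_len, seq_len] weights from one model layer.
    token_to_line: maps each token index to its source line number.
    """
    # Average, across heads, the attention the cursor token pays to every
    # earlier token (the mechanism is causal, so it only looks backwards).
    per_token = attention[:, cursor_token, :cursor_token + 1].mean(axis=0)

    # Sum token-level scores into line-level scores.
    num_lines = max(token_to_line) + 1
    per_line = np.zeros(num_lines)
    for tok, line in enumerate(token_to_line[:cursor_token + 1]):
        per_line[line] += per_token[tok]

    # Ignore the cursor's own line and return the highest-scoring lines.
    per_line[token_to_line[cursor_token]] = 0.0
    return [int(i) for i in np.argsort(per_line)[::-1][:top_k]]
```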

What does that have to do with actual developers? Rather a lot, as it turns out! GitHub Copilot’s idea of what constitutes relevant background for a particular snippet of code isn’t so different from a real developer’s.

We asked developers to explore some code and complete a task while closely tracking their eye movements, noting where they looked next after their currently focused position. Then we figured out how best to interpret the neural network’s attention to predict which positions in the code a developer is likely to need next.

That research forms the basis of our processing of the attention signal. Dig into the details on arXiv.
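One simple way such predictions could be scored against eye-tracking data is a top-k hit rate: for each fixation, check whether the line the developer actually looked at next appears among the predicted lines. The data format and the `predict_next_lines` callable below are illustrative assumptions, not the exact setup used in the paper.

```python
from typing import Callable, Sequence

def top_k_hit_rate(fixations: Sequence[tuple[int, int]],
                   predict_next_lines: Callable[[int], list[int]],
                   k: int = 5) -> float:
    """fixations: (current_line, next_line) pairs from an eye-tracking trace."""
    hits = 0
    for current_line, next_line in fixations:
        # Count a hit if the actually-viewed next line is among the top-k predictions.
        if next_line in predict_next_lines(current_line)[:k]:
            hits += 1
    return hits / len(fixations) if fixations else 0.0
```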

Comparison of different ways to predict the next line a given developer looks at.

How does it look?

GitHub Copilot Radar lives in a side panel in Visual Studio Code, showing your current cursor position alongside the other relevant code snippets. Each snippet carries an indicator representing the type of relevance.

Snippets that Copilot thinks are relevant are highlighted with the Copilot logo — for example, the central part of the documentation.

What’s next?

This is an early prototype, and there are several directions we want to pursue before we make it publicly available. Next up...

  • We want to find a way to include more static analysis sources in the radar screen without cluttering it.

  • We want to add content from other files, since those locations are often the hardest to discover.

  • The attention mechanism only looks backwards, but it is possible to process it in a way that also yields information about subsequent lines of code. We want to complement this with static analysis signals like calls, references, and tests; one way the two signals could be combined is sketched below.
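Here is a minimal sketch of one way the neural attention signal could be merged with static analysis signals such as call sites, references, or tests. The weighting scheme and the shape of the inputs are assumptions made for illustration, not a description of how the prototype works.

```python
def combine_signals(attention_scores: dict[int, float],
                    static_hits: dict[int, set[str]],
                    static_weight: float = 0.5) -> list[tuple[int, float]]:
    """Merge attention-based and static-analysis relevance into one ranking.

    attention_scores: line number -> normalized attention score.
    static_hits: line number -> kinds of static relationships
                 (e.g. 'call', 'reference', 'test').
    """
    combined: dict[int, float] = dict(attention_scores)
    for line, kinds in static_hits.items():
        # Each static relationship adds a fixed bonus, so discrete signals can
        # surface lines the attention mechanism missed entirely.
        combined[line] = combined.get(line, 0.0) + static_weight * len(kinds)
    # Highest combined score first.
    return sorted(combined.items(), key=lambda item: item[1], reverse=True)
```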