
Changelog

See the latest updates, features, and releases from the team at Software.com.




Fix
4 days ago

📣 Deployment Frequency Fix

We recently identified and addressed a discrepancy in our Deployment Frequency calculation on the KPIs page. Over the last week (approximately May 20th - May 27th), the metric did not include deployments that exclusively contained pull requests authored by excluded users and bots. This has been corrected, and the metric now reflects an accurate count of all deployments.
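As an illustrative sketch of the corrected behavior (the data shapes and names below are hypothetical, not Software.com's actual model), a deployment is now counted even when every pull request it contains came from an excluded user or bot:

```python
# Sketch of the Deployment Frequency fix: count every deployment,
# regardless of who authored the pull requests inside it.

def count_deployments(deployments, excluded_authors):
    """Corrected behavior: all deployments count."""
    return len(deployments)

def count_deployments_old(deployments, excluded_authors):
    """Previous (incorrect) behavior: a deployment was skipped if all of
    its pull requests were authored by excluded users or bots."""
    return sum(
        1 for d in deployments
        if any(pr["author"] not in excluded_authors for pr in d["pull_requests"])
    )

deployments = [
    {"pull_requests": [{"author": "alice"}]},
    {"pull_requests": [{"author": "dependabot"}]},  # bot-only deployment
]
excluded = {"dependabot"}
```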

Brett Stevens
Improvement
a week ago

🚀 Git User Improvements

We’ve made several improvements to clarify how we track Git users, and we’ve added support for excluding pull requests created by specific users.

Git Users include anyone in your GitHub, GitLab, Bitbucket, or Azure DevOps organization. You can view a list of Git Users in your organization’s Settings under the Git Users tab.

Git Users include both Current and Past users contributing to repositories in your organization. Past Git Users are included in your historical data.

You can manually exclude pull requests created by managers, product managers, designers, and other Git users outside of engineering. Reviews and approvals from these users will still count towards the progress of your pull requests to preserve accuracy (e.g., calculating Time to Review), but they will not factor into contributor-based metrics (e.g., Developer, Reviewer, and Contributor counts).

Pull requests created by bots are automatically excluded from your data. Their reviews and other pull request activity are not factored into any metrics. If we are unable to automatically detect bots in your Git organization, you can manually exclude all pull request activity from those users.

Open source contributors are also automatically excluded from your data. An open source contributor is any user that is not an organization member and only contributes to public repositories.
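The exclusion rules above can be sketched in code. This is a minimal illustration, assuming a hypothetical pull-request schema (the field names are ours, not Software.com's): manually excluded users drop out of contributor counts but their reviews still count, while bots are excluded from everything.

```python
# Sketch of the exclusion rules: excluded users' PRs don't count toward
# contributor metrics, but their reviews still count; bot activity is
# excluded entirely. Field names are hypothetical.

def contributor_count(pull_requests, excluded_users, bots):
    """Contributor-based metrics ignore PRs from excluded users and bots."""
    authors = {
        pr["author"] for pr in pull_requests
        if pr["author"] not in excluded_users and pr["author"] not in bots
    }
    return len(authors)

def reviews_counted(pr, bots):
    """Reviews from manually excluded users still count (e.g. for Time to
    Review); reviews from bots never do."""
    return [r for r in pr["reviews"] if r["reviewer"] not in bots]

prs = [
    {"author": "alice", "reviews": [{"reviewer": "pm-bob"}]},
    {"author": "pm-bob", "reviews": []},    # manually excluded PM
    {"author": "renovate", "reviews": []},  # bot
]
```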

If you have any questions, please reach out to us at support@software.com.

Brett Stevens
Improvement
a week ago

✨ Excluding Drafts from Lead Time

To ensure your lead time metrics reflect work that is actively in the review and merge pipeline, we've refined how draft pull requests are handled. All stages, including "Time to Approve," now consistently exclude drafts, giving you a cleaner signal on the efficiency of your active development efforts (note: there is no change to how “Time to Review” is calculated, which still starts when a pull request is opened or marked as ready for review).
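A minimal sketch of the "Time to Review" rule described above, using hypothetical timestamps and field names: the clock starts when the pull request is opened, or when it is marked ready for review if it began as a draft, so time spent in draft is excluded.

```python
# Sketch: Time to Review starts at "ready for review" (or at open, if the
# PR was never a draft), so draft time is excluded. Fields are hypothetical.
from datetime import datetime

def time_to_review(pr):
    """Hours from ready-for-review (or open) to the first review."""
    start = pr.get("ready_for_review_at") or pr["opened_at"]
    return (pr["first_review_at"] - start).total_seconds() / 3600

pr = {
    "opened_at": datetime(2025, 5, 1, 9, 0),            # opened as a draft
    "ready_for_review_at": datetime(2025, 5, 2, 9, 0),  # marked ready
    "first_review_at": datetime(2025, 5, 2, 12, 0),
}
```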

These updates are designed to give you a truer, more actionable understanding of your team's development velocity. If you have any questions, please reach out to us at support@software.com.

Geoff
Release
a month ago

🧭 Leading Indicators

We’ve released Leading Indicators—actionable data to help you identify and unblock your top constraint.

Software.com is divided into KPIs and Leading Indicators. Whereas KPIs are lagging metrics that focus on results and outcomes, leading indicators focus on inputs and actions, predicting future performance and trends.

Leading Indicators are designed to be actionable, helping you quickly pinpoint the biggest constraint in your development process. We highlight one high-impact area at a time, making it easy for you to take action without being buried in dashboards or noise.

Constraints vary by team and shift over time. Schedule time with us to review and identify your top constraint together, or feel free to reach out to us at support@software.com.

Geoff
Release
a month ago

🚀 New KPIs Report

Our new KPIs report automates the most important metrics for development teams, such as New Deliveries, Rework (vs. New), Deployments, and Lead Time.

The KPIs report features several notable improvements vs. our previous reports:

  • Improved terminology: we’ve updated the name of the “Features” metric to “New Deliveries” to avoid confusion with project management terminology. This change is purely naming-related and does not impact how the metric is calculated or displayed.
  • Optimal values: based on data for over 700K developers and 10K companies, we’ve added optimal values for New Deliveries per Developer, Rework, and Lead Time.
  • Deployments: if you’ve configured deployment methods for your repositories, you will see Deployments per Developer on your KPI report.
  • Improved filtering controls: you can filter down to any group or any time period. You can also adjust the granularity to weekly, monthly, or quarterly or view as a table instead of graphs for easier reporting.

If you have any questions, feel free to reach out to us at support@software.com.

Geoff
Improvement
3 months ago

⚡ Improved terminology

We’ve updated the name of the “Features” metric to “New Deliveries” to avoid confusion with project management terminology. This change is purely naming-related and does not impact how the metric is calculated or displayed.

Brett Stevens
Improvement
3 months ago

📐 Improved work breakdown algorithms

We’re making an adjustment to improve the accuracy of our data — specifically, how we track the amount of code that is New, Churn, and Refactor. These updates provide a more accurate view of changes over time, especially for teams that ship new code iteratively.

You may see adjustments to your metrics and benchmarks—most notably, an increase in New Units per Developer (formerly Features per Developer) and a higher percentage of pull requests categorized as New. Since these improvements apply to historical data as well, your data will be uniformly adjusted upward, maintaining all relative differences and trends.

By tracking a more detailed history of file line changes—including both their original position and where they appear after edits—we are now more precisely differentiating between newly written code and modifications to existing work. This added granularity gives a more refined view of how code evolves over time, ensuring that new work is accurately captured and modifications are correctly attributed.

You can learn more about how pull requests are categorized in our documentation. If you have any questions, feel free to reach out to us at support@software.com.

Geoff
Improvement
4 months ago

➗ Improved algorithms

We’ve updated our algorithms to more precisely track your lead time and productivity metrics.

  1. Lead Time Stages

    Lead Time is incrementally updated when a PR completes a stage, rather than waiting for the PR to complete all stages. For instance:

    • Previously, Time to Review was updated after a PR was merged.
    • Now, Time to Review is updated when a PR is reviewed.
    • This change ensures that your Lead Time metrics are more timely and reflect an ongoing historical trend.

  2. Excluded Pull Requests

    Our algorithm automatically excludes pull requests that are not representative of development productivity, such as those authored by bots and backmerges from main. We also exclude pull requests that are part of your release process (e.g. promoting changes from a release branch to main). We’ve refined our algorithm to more precisely detect these pull requests, such as by detecting re-used branches. Depending on your development process, you may notice a slight increase in your pull request metrics.
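The incremental updates in item 1 can be sketched as follows. This is an illustration under our own assumptions (the stage names and structure are hypothetical): each stage is recorded the moment it completes, so Time to Review reflects a PR as soon as it is reviewed, not only after it merges.

```python
# Sketch of incremental Lead Time updates: record each stage as soon as
# it completes, rather than waiting for the PR to finish all stages.
# Stage names are hypothetical.

def record_stage(metrics, pr_id, stage, duration_hours):
    """Update the running metrics the moment a PR finishes a stage."""
    metrics.setdefault(stage, {})[pr_id] = duration_hours
    return metrics

metrics = {}
# PR 101 is reviewed: Time to Review updates immediately, pre-merge.
record_stage(metrics, pr_id=101, stage="time_to_review", duration_hours=4.5)
# Later, the PR merges and the next stage is recorded.
record_stage(metrics, pr_id=101, stage="time_to_merge", duration_hours=20.0)
```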

If you have any questions about how our algorithms work, feel free to reach out to us at support@software.com.

Brett Stevens
Release
6 months ago

⌚ Time-based comparisons

We’ve released Time-Based Comparisons, which let you compare productivity before/after a key event. Examples of major events that you can measure with this feature include acquisitions, layoffs, leadership changes, and adopting new tools.

To get started, go to the Comparisons tab and click Create Comparison. When setting up the comparison, select Before vs. after an event, which will allow you to define a new key event and select the groups that you want to compare.

If you have any questions or would like help setting up a time-based comparison, feel free to reach out to us at support@software.com.

Brett Stevens
Improvement
8 months ago

⚙️ Improvement to developer counts for comparisons, benchmarks

Based on feedback from our community, we’ve improved our Features per Developer metric to better standardize industry benchmarks and provide better accuracy when comparing groups within your organization, such as developers using vs. not using GitHub Copilot.

As part of this update, the developer count will include contributors who created a pull request in the last 90 days. Previously, the developer count included any contributor who created or reviewed a pull request in the last 30 days.

Extending the activity window from 30 to 90 days accounts for seasonality, providing a more stable representation of your organization’s size over time. Developers who leave your organization are counted towards historical activity but are immediately removed from your developer count going forward.

Cost per feature still includes contributors who created or reviewed code in the last 90 days—to more precisely capture the costs associated with developing, reviewing, and delivering features. Development costs are calculated by multiplying a group’s cost per contributor by the count of developers and reviewers within the group.
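The counting rules above can be sketched in code. This is a minimal illustration with hypothetical field names and cost figures: developers are contributors who created a pull request within the 90-day window, and development cost multiplies cost per contributor by the combined count of developers and reviewers.

```python
# Sketch of the 90-day developer-count window and the cost formula.
# Field names and the cost figure are hypothetical.
from datetime import datetime, timedelta

def developer_count(pull_requests, now, window_days=90):
    """Developers = contributors who created a PR in the last 90 days."""
    cutoff = now - timedelta(days=window_days)
    return len({pr["author"] for pr in pull_requests
                if pr["created_at"] >= cutoff})

def development_cost(cost_per_contributor, developers, reviewers):
    """Cost = cost per contributor x count of developers and reviewers."""
    return cost_per_contributor * len(developers | reviewers)

now = datetime(2025, 5, 1)
prs = [
    {"author": "alice", "created_at": now - timedelta(days=10)},
    {"author": "bob", "created_at": now - timedelta(days=120)},  # outside window
]
```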

You may notice small changes to your metrics and global benchmarks. You can read more about this change in our documentation, or feel free to reach out to us with any questions at support@software.com.

Geoff