Rongchai Wang
Jan 17, 2026 09:16
To enhance system stability and prevent performance degradation, GitHub has introduced a rate limit of 200 uploads per minute per repository for Actions cache entries, addressing concerns related to high-volume uploads and cache thrashing.
GitHub has implemented a new rate limit on its Actions cache system, capping uploads at 200 new cache entries per minute for each repository. This change, announced on January 16, 2026, aims to mitigate the impact of high-volume uploads on system stability, particularly for repositories that were previously hammering the cache system with rapid-fire uploads.
It’s worth noting that downloads remain unaffected by this change. If your workflows only pull existing cache entries, you won’t notice any difference. The limit applies solely to the creation of new entries, an important distinction for teams whose CI/CD pipelines run parallel builds that each generate fresh cache data.
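In practice, that distinction maps to the restore and save halves of the cache action. A minimal sketch, using the actions/cache restore and save sub-actions with an illustrative npm cache path and key (not taken from GitHub's announcement):

```yaml
steps:
  - uses: actions/checkout@v4
  # Restoring an existing entry is a download and is not affected by the new limit.
  - uses: actions/cache/restore@v4
    with:
      path: ~/.npm
      key: npm-${{ hashFiles('**/package-lock.json') }}
  - run: npm ci
  # Saving creates a new cache entry, which is what counts toward the
  # 200-entries-per-minute cap.
  - uses: actions/cache/save@v4
    with:
      path: ~/.npm
      key: npm-${{ hashFiles('**/package-lock.json') }}
```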
The primary reason for this change is to prevent “cache thrashing,” which occurs when repositories upload massive volumes of cache entries in short bursts, degrading performance for every user on the shared infrastructure. By introducing a 200-per-minute cap, GitHub gives heavy users enough headroom for legitimate use cases while preventing abuse that could destabilize the system. The change should also make cache performance more predictable for everyone on the shared service.
Part of a Broader Actions Overhaul
This rate limit is part of a series of significant changes to GitHub Actions economics. Earlier this month, GitHub reduced pricing on hosted runners by 15% to 39%, depending on size. The more consequential news, however, is a new charge for self-hosted runner usage in private repositories, starting at $0.002 per minute and effective March 1, 2026. That change is prompting some teams to reassess their CI/CD architecture and whether self-hosted runners still make sense for them.
In late 2025, the cache system itself was upgraded to let repositories exceed the previous 10 GB limit through pay-as-you-go pricing. Every repository still receives 10 GB of free storage, but heavy users can now purchase more rather than constantly tuning eviction policies to stay under the cap.
What Teams Should Check
Most workflows won’t be affected by this limit. However, if you run matrix builds that generate unique cache keys across dozens of parallel jobs, it’s worth estimating how many entries a single run creates. For instance, a 50-job matrix completing simultaneously could hit 200 cache uploads in under a minute if each job saves four or more entries, potentially creating a bottleneck in your CI/CD pipeline.
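To see how a matrix can approach the cap, consider a hypothetical workflow in which every job writes its own uniquely keyed entry on every run (the target list, paths, and key scheme below are illustrative, not from GitHub's announcement):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # Imagine this list expanded to 50 targets.
        target: [linux-x64, linux-arm64, macos-x64]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/cache@v4
        with:
          path: ~/.cache/build
          # Including the matrix value and the run id makes every job on every
          # run save a brand-new entry, so 50 jobs finishing together can push
          # dozens of uploads into the same minute.
          key: build-${{ matrix.target }}-${{ github.run_id }}
      - run: ./build.sh --target ${{ matrix.target }}
```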
To stay under the limit, teams can consolidate cache keys where possible or stagger job completion. GitHub hasn’t announced a monitoring dashboard for cache upload rates, so concerned teams will need to audit their workflow logs manually to see how many entries each run creates.
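One way to consolidate, sketched under the same illustrative assumptions, is to replace the cache step above with a key derived from the content that actually changes (for example a lockfile hash) rather than from the job or run, so parallel jobs share one entry and only a genuine miss triggers a new upload:

```yaml
      - uses: actions/cache@v4
        with:
          path: ~/.cache/build
          # All matrix jobs resolve to the same key, so at most one new entry
          # is uploaded per dependency change instead of one per job per run.
          key: build-${{ hashFiles('**/package-lock.json') }}
          restore-keys: |
            build-
```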
Image source: Shutterstock