How LSports Went from 7% to 70% Unit Test Coverage while Strengthening AI Code Resilience
The sports data provider partnered with Codacy to build a quality-first engineering culture, using quality gates, IDE-level feedback, and AI guardrails to protect their codebase as they scale.
- 10x increase in unit test coverage (from under 7% to 70%)
- 800 core repositories standardized under unified quality gates
- Zero new critical security issues introduced over two years
"Codacy is kind of a solid rock. It's in our foundation. It protects us from dropping the maturity that we've reached. Each and every new repo already starts with GitHub status checks and quality gates that don't allow them to start without unit tests."
— Ronen Yurik, DevEx Director, LSports
About LSports
LSports is a world-leading provider of real-time sports data, serving sportsbooks, media organizations, and fan experience innovators globally. Founded in 2012 and headquartered in Ashkelon, Israel, the company powers betting experiences across 100+ sports, 15,000 leagues, and 2,500 markets using proprietary API technology. LSports processes over 175,000 pre-match and 100,000 in-play events each month, delivering live scores, odds, statistics, and settlements to partners worldwide.
Challenge
No quality practices, no testing, and no visibility
When Ronen Yurik joined LSports as DevEx Director, he inherited a codebase where quality had not been a priority. Production issues were common, customer-facing bugs were frequent, and the engineering organization lacked standardized practices across its approximately 1,000 repositories.
His first step was mapping the landscape. What he found was stark: across the 800 core repositories that mattered most to the business, unit test coverage averaged less than 7%.
"When I came in, quality was not a thing at LSports at all," Ronen explained. "People didn't really pay attention to that. A lot of issues raised from production, from customers. I kind of mapped the territory, what we've got and what we don't have, where the gaps are. And what I saw is that besides no end-to-end and integration testing, there was no unit testing at all."
LSports had been using SonarCloud, but Daniel Netzer, now CPTO, led an evaluation that concluded a switch was needed. The decision came down to three factors:
- SonarCloud required compiling C# code before scanning, which added extra steps to their CI pipeline.
- SonarCloud charged by lines of code, making costs unpredictable as the codebase grew.
- SonarCloud was difficult to set up and maintain at scale.
LSports needed a tool that was easy to implement, easy to roll out, and wouldn't create additional overhead for their engineering team.
And the challenge extended beyond tooling. LSports operates in real-time sports data, where a single undetected defect in an API or trading data pipeline could have immediate, widespread impact on clients during a live sporting event. The company needed visibility into code quality across hundreds of services, enforcement mechanisms that would scale, and a way to prevent the introduction of new problems while addressing the existing backlog.
Solution
How LSports built quality infrastructure with Codacy
Ronen partnered with Codacy's customer success team to build a structured implementation plan. Working closely with Amy Hunt, Codacy’s Customer Success Team Lead, he established quarterly business reviews to set goals and track progress. The Codacy team conducted training sessions on creating quality gates and configuring rules for each programming language in LSports' stack.
Rather than accepting default configurations, Ronen worked with LSports' architects to define what quality meant for their specific context. Codacy's flexibility allowed them to start with recommended settings, then customize policies to match their needs. What began as four policies (excluding defaults) evolved into multiple policies per language, per team, giving groups flexibility while maintaining baseline coding, testing, and security standards.
"Codacy was very flexible and helped us define what the guidelines are from our side," Ronen said. "We started with the recommendations and all the general stuff, but then we optimized it to our needs. We allow flexibility as long as the quality gate is not breached."
Enforcing standards through gates and internal competition
The implementation took a two-pronged approach.
For new repositories, LSports implemented GitHub status checks and quality gates from day one, preventing any new service from launching without unit tests. For existing repositories, Ronen created an internal competition, publishing each team's progress monthly to drive adoption through visibility and friendly rivalry.
Security enforcement became part of the merge gate as well, blocking the introduction of new critical security issues at the Pull Request level. Codacy mapped their existing security backlog, allowing them to address legacy issues like SAST violations, exposed secrets, and insecure dependencies systematically while preventing new ones from entering the codebase.
"We blocked the ability to introduce new security issues for the past two years," Ronen noted. "We have a plan to address the backlog going forward. I see it as a positive: first closing the gate, and then we will deal with the animals we have inside."
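The merge-gate logic described above can be sketched as a simple check run as a PR status check. This is a minimal illustration, not Codacy's actual implementation; the function name, inputs, and the 70% threshold are assumptions.

```python
# Hypothetical sketch of a PR-level quality gate: block the merge if the
# change introduces new critical security issues or drops coverage below
# a threshold. Names and the 70% threshold are illustrative only.

def quality_gate(coverage_pct: float, new_critical_issues: int,
                 min_coverage: float = 70.0) -> tuple[bool, str]:
    """Return (passed, reason) for a proposed pull request."""
    if new_critical_issues > 0:
        return False, f"blocked: {new_critical_issues} new critical security issue(s)"
    if coverage_pct < min_coverage:
        return False, f"blocked: coverage {coverage_pct:.1f}% < {min_coverage:.0f}%"
    return True, "quality gate passed"

# A CI job would evaluate this and set the GitHub status check accordingly.
print(quality_gate(72.5, 0))   # passes
print(quality_gate(72.5, 1))   # blocked on security, regardless of coverage
```

Note how the security check runs first: a PR with a new critical issue is blocked even if coverage is healthy, matching the "first closing the gate" approach described above.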
Shifting feedback earlier with IDE integration and AI guardrails
As Codacy released new capabilities, LSports adopted them to shift Codacy's feedback earlier in the development cycle. The IDE plugin and MCP (Model Context Protocol) integration brought Codacy's analysis directly into developers' IDEs and copilot chat panels, providing instant feedback and fix automation before code ever reached version control.
"The ability to get the feedback in the IDE is a very dramatic change," Ronen explained. "Developers can get feedback instantly before they're merging or creating a PR, with accurate suggestions on how to resolve and address those points. It really saves time. They don't need to wait for the entire cycle until the code gets to GitHub and then gets feedback. It's almost immediate."
This became especially valuable as LSports embraced AI-assisted development. As the volume of AI-generated code grew, so did the review burden and reviewer fatigue. Codacy Guardrails provided a critical safety layer, securing AI-generated code in real time within the IDE and serving as a backstop at the PR level.
"We are seeing an increase in code lines from AI generation," Ronen said. "Having as many guardrails as possible in each and every layer, whether it's the IDE or the PR, is essential. We're using both the IDE plugin for immediate feedback, and if people ignore that, we still have the gate at the PR level."
Connecting Codacy to their internal developer portal
To tie quality metrics into broader engineering standards, LSports integrated Codacy data into their internal developer portal (IDP) through Port. They built a deployment maturity model where the Codacy score and code coverage became prerequisites for teams to achieve certification levels.
"If you want to be at bronze level, which is one above the baseline, you need to have a high Codacy score and high coverage," Ronen explained. "Codacy was the first thing we integrated into Port. It helped us align and understand the full picture of where our organization's maturity is from a quality standpoint."
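A deployment maturity model like the one Ronen describes can be expressed as a simple mapping from quality metrics to certification tiers. The thresholds below, and any tier above bronze, are hypothetical; LSports' actual criteria are internal.

```python
# Hypothetical certification ladder: Codacy grade and coverage gate each
# tier. Thresholds and the tier above bronze are illustrative assumptions.

def certification_level(codacy_grade: str, coverage_pct: float) -> str:
    """Map quality metrics to a (hypothetical) maturity tier."""
    if codacy_grade == "A" and coverage_pct >= 85.0:
        return "silver"
    if codacy_grade in ("A", "B") and coverage_pct >= 70.0:
        return "bronze"   # one level above the baseline, per the model described
    return "baseline"
```

Encoding the ladder as data the portal can evaluate is what makes the Codacy score a hard prerequisite rather than a guideline.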
The integration was straightforward. Codacy's detailed API allowed LSports to pull any metric they needed into their portal without friction.
"Codacy has a very easy and simple API, but also very detailed. You can pull almost everything. The integration was very fluent and easy. I don't recall that we had any issues."
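Pulling a metric from Codacy's REST API into a portal like Port might look like the sketch below. The endpoint path and response field names here are assumptions for illustration; consult Codacy's API reference for the real shapes.

```python
import json
from urllib import request

API_BASE = "https://app.codacy.com/api/v3"  # assumed base path

def fetch_repo_metrics(org: str, repo: str, token: str) -> dict:
    # Endpoint path is an assumption for illustration.
    url = f"{API_BASE}/analysis/organizations/gh/{org}/repositories/{repo}"
    req = request.Request(url, headers={"api-token": token})
    with request.urlopen(req) as resp:
        return json.load(resp)

def to_portal_entity(payload: dict) -> dict:
    """Map the (assumed) Codacy response onto the fields the IDP tracks."""
    return {
        "codacy_grade": payload.get("gradeLetter", "?"),
        "coverage_pct": payload.get("coveragePercentage", 0.0),
        "open_issues": payload.get("issuesCount", 0),
    }
```

A scheduled sync job would fetch each core repository and upsert the mapped entity into the portal, keeping maturity scores current.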
Results
10x coverage increase and zero new security issues in two years
Over approximately two years of structured implementation, LSports achieved tangible results:
- 10x increase in unit test coverage (from under 7% to 70%)
- 800 core repositories standardized under unified quality gates
- Zero new critical security issues introduced over two years
"From less than 7% unit test coverage to almost 70% across the organization is amazing from our perspective," Ronen said. "We didn't expect that level of improvement."
Meanwhile, the engineering team scaled from 60 to 150 Codacy seats (2.5x growth) while maintaining quality standards.
The metrics tell part of the story, but the cultural shift matters equally. Quality is now embedded in LSports' engineering DNA. New repositories cannot launch without meeting baseline standards. Teams compete on quality metrics. Developers receive feedback in their IDE before code ever reaches review.
Looking ahead, LSports plans to expand its use of mutation testing, using AI-assisted workflows to identify gaps in test effectiveness beyond simple coverage metrics. They also intend to address their security backlog systematically, building on the foundation of two years of preventing new issues.
"Despite the increase in code lines from AI generation, quality metrics like production incidents and customer bugs are stable. That suggests our current guardrails and education phase are effective. Codacy protects us from dropping the maturity that we've reached."