How Does the JRC Approach to Repairability Scoring Differ from iFixit’s?

When it comes to repairability scoring, iFixit isn’t the only game in town anymore. In addition to the French Repairability Index, the European Commission’s Joint Research Centre (JRC) also provides a set of criteria and a rubric for scoring the repairability of many device types, including consumer electronics. The JRC method is likely to become the norm for smartphone scoring in Europe in 2023, and thus will play an important role in how we discuss repairability.

But not all repairability scores are created equal. So what’s the difference? Let’s compare iFixit’s scoring method with the JRC repairability scoring method for smartphones and tablets. 


iFixit’s scoring method and the JRC’s method have similar goals: both aim to help consumers make informed purchasing decisions. At iFixit, we show how quickly and easily an end user could repair their own device at home, and our scores also support manufacturers in their efforts to design more repairable products. In contrast, the goal of the JRC’s scoring method is to assess the degree to which the European ecodesign requirements for smartphones and tablets are met.

The Scoop on Scope

As always, scoring repairability is complex—there’s a lot to consider, and more than one valid approach. The iFixit and JRC scoring methods differ in scope: each considers different repairability aspects, priority parts, repair agents, and timelines in calculating a score. So while both scores represent overall product repairability, they factor in different elements along the way.

In terms of repair aspects, both methods factor in the product design, the provision of spare parts, and repair information. The JRC method also includes the availability of software updates over time—which, at the time of this writing, iFixit’s score does not. 

In both methods, products are assessed based on their priority parts. These are components with a high likelihood of failure over the product’s lifespan and which are critical to its functionality. Both methods consider the following priority parts for smartphones: the display assembly, battery, back cover or enclosure, front and rear cameras, external charging port, and mechanical buttons. For foldable phones, the JRC also considers the hinge assembly a priority part. Each system also weights the priority components differently, and the greater the number of priority parts assessed, the less each individual part contributes to the final score.

The repair agents—that is, who performs the repair—considered in each scoring system overlap, but are slightly different. At iFixit, we evaluate the repair ecosystem available for self-repairers, on the assumption that DIY repair enables all levels of repair (in-house technicians, authorized service providers, independent repair shops, etc.). In contrast, the JRC method explicitly considers both consumers and professional repairers.

One final notable difference in scope is the timeline. Both iFixit and the JRC use data available at the time of the analysis. Additionally, the JRC considers the manufacturer’s commitment to make information, spare parts, and software updates available over time. Manufacturers must attest to the availability of information, spare parts, and software updates that align with the minimum requirements (e.g. five years for all priority parts in smartphones). At iFixit, seeing is believing, so we try to verify what’s actually available to us and other would-be DIYers. When the situation is in flux, such as near a new product launch, we sometimes award provisional credit to OEMs that have a strong track record of making parts and information available, or that have made credible commitments to do so—but the score isn’t final until we can order a part and complete the repair ourselves.

The Devil’s in the Details—Criteria, Subcriteria, and Weights

At a high level, the iFixit and JRC methods are somewhat similar in that they both stem from the European standard EN45554, which provides a framework for assessing repairability. However, the scoring rubrics—including the weight of each criterion—are different. iFixit’s aggregated score is a number from 0 to 10, where 10 signifies the best score and an easy repair. The JRC score is different, ranging from 1 to 5—where 1 signifies that the minimum requirements are met, and 5 means the requirements are exceeded. Figure 1 depicts the criteria and weighting for each scoring method.

Figure 1 – Visual overview of iFixit’s and JRC’s scores, main criteria, and subcriteria

Product Design 

Product design accounts for 80% of iFixit’s score. It’s assessed by disassembling and reassembling the product to replace each priority part, while recording average time intervals—known as proxy times—associated with each action required along the way. Each proxy time is multiplied by a correction factor for the type of tools required, and more time is added for each tool change needed. The resulting times are weighted and scaled for each priority part, and then combined with other factors in the overall score.
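To make the proxy-time mechanics concrete, here is a minimal Python sketch. All of the numbers below—tool factors, the tool-change penalty, and the proxy times—are invented for illustration and are not iFixit’s actual values.

```python
# Illustrative sketch of a proxy-time-based design score.
# All constants below are hypothetical, NOT iFixit's real values.

TOOL_FACTOR = {"none": 1.0, "basic": 1.25, "specialized": 1.5}  # assumption
TOOL_CHANGE_PENALTY = 10  # hypothetical seconds added per tool change

def repair_time(steps):
    """Sum proxy times scaled by tool difficulty, plus tool-change penalties.

    `steps` is a list of (proxy_seconds, tool_category) tuples describing
    the actions needed to replace one priority part.
    """
    total = 0.0
    last_tool = None
    for proxy_seconds, tool in steps:
        total += proxy_seconds * TOOL_FACTOR[tool]
        if last_tool is not None and tool != last_tool:
            total += TOOL_CHANGE_PENALTY
        last_tool = tool
    return total

# Example: a battery replacement with two actions and one tool change.
battery_steps = [(60, "basic"), (30, "specialized")]
print(repair_time(battery_steps))  # 60*1.25 + 30*1.5 + 10 = 130.0
```

The real method then weights and scales these per-part times before combining them into the overall score.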

The JRC method assigns a weight of 55% to product design, which is further divided into three subcriteria: disassembly depth (25%), type of fasteners (15%), and the type of tools needed (15%) for each priority part. Disassembly depth is measured by the number of “steps” to reach the priority part, and a score is assigned based on the number of steps. The highest rating, a score of 5, is assigned when there is only one step. More than 15 steps would score 1 point. The types of fasteners are categorized by whether they are reusable (5 points) or only removable (1 point). Lastly, the types of tools needed to disassemble the product to access the priority part are sorted and scored according to specific categories, ranging from no tools (5 points), to basic tools, tools supplied with replacement parts, tools supplied with the product, and commercially available tools (1 point).
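The three design sub-scores reduce to simple point ladders. In this sketch, only the 5-point and 1-point endpoints come from the description above; the intermediate bands, the interpolation, and the intermediate tool point values are simplifying assumptions.

```python
import math

# Sketch of the JRC product-design sub-scores described above. The 5-point
# and 1-point endpoints come from the text; intermediate bands and point
# values are simplifying assumptions for illustration.

def depth_score(steps):
    """Disassembly depth: 5 points for one step, 1 point above 15 steps."""
    if steps <= 1:
        return 5
    if steps > 15:
        return 1
    return 5 - math.ceil(4 * (steps - 1) / 14)  # interpolation is an assumption

def fastener_score(reusable):
    """Reusable fasteners score 5 points; merely removable ones score 1."""
    return 5 if reusable else 1

# Tool categories from best (no tools) to worst (commercially available tools);
# the intermediate point values are an assumption.
TOOL_POINTS = {"none": 5, "basic": 4, "with_spare_part": 3,
               "with_product": 2, "commercial": 1}

def design_contribution(depth, fasteners, tools):
    """Weighted design contribution: 25% + 15% + 15% = 55% of the JRC score."""
    return 0.25 * depth + 0.15 * fasteners + 0.15 * tools
```

A device scoring 5 on all three sub-criteria would contribute 2.75 points (55% of the maximum 5) to the overall JRC score.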

Spare Part Availability

Accessing the broken component is only half of a repair—to be successful, you also have to be able to install a compatible replacement. For a full score for spare parts availability, iFixit looks for parts offered for sale to the public on the manufacturer’s support website (or for a link where OEM parts may be purchased elsewhere). Parts availability accounts for 10% of the iFixit score. The highest score is awarded when replacements for all priority parts are on sale at the time of analysis, with no individual part costing more than 25% of the retail price of the device. The lowest score is assigned when none are available. 

The JRC method takes a slightly different approach, assigning the lowest score (1 point) to the minimum ecodesign requirement: all priority parts are available to professional repairers, and the display assembly is also available to end users. The score increases in tiers based on additional parts being available to all users. If the display and battery can be purchased by end users, the score is 2 points. If the back cover is also available, it goes to 3 points. When replacement parts for cameras can also be purchased, 4 points. When all priority parts are available to both end users and professional repairers, the score is 5 points. Spare parts need to be available for 5 years (6 years for tablets) after the last unit of the model is placed on the market, and are required to be delivered within 5 working days. Manufacturers are also requested to disclose the pricing of the spare parts, and cannot raise the price of parts after placing the product on the market.
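The tier logic above can be expressed as a small cumulative lookup. The part names here are shorthand, and the floor of 1 point assumes the minimum ecodesign requirement (all priority parts to professionals, displays to end users) is already met.

```python
# Priority parts for a smartphone, per the text above (shorthand names).
ALL_PRIORITY = {"display", "battery", "back cover", "cameras",
                "charging port", "buttons"}

def spare_parts_score(end_user_parts):
    """JRC-style tiered score based on which parts end users can buy.

    Assumes the minimum ecodesign requirement is met, so the floor is 1.
    Each higher tier requires all lower-tier parts as well.
    """
    tiers = [
        {"display"},                                      # 1 point (minimum)
        {"display", "battery"},                           # 2 points
        {"display", "battery", "back cover"},             # 3 points
        {"display", "battery", "back cover", "cameras"},  # 4 points
        ALL_PRIORITY,                                     # 5 points
    ]
    score = 1
    for points, required in enumerate(tiers, start=1):
        if end_user_parts >= required:  # superset test
            score = points
    return score

print(spare_parts_score({"display", "battery"}))  # 2
print(spare_parts_score(ALL_PRIORITY))            # 5
```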

Repair Information

Another 10% of the iFixit score goes to repair information and documentation. iFixit assesses whether repair information is free and publicly available on the manufacturer’s support website. Repair documentation should contain an exploded view of the product, a list of numbered or compatible parts, and repair instructions for each priority part. iFixit also checks for tools lists, troubleshooting information, and schematics. The highest score is assigned when all information is complete, and the lowest when no information is publicly and freely available. 

The JRC scores repair information based on the number of repair agents to which it is available, the cost, and the content. Repair information is expected to be available for 7 years after the last model is on the market. A score of 1 is assigned when repair information is available at a “reasonable” price to registered professional repairers. A score of 3 means the information is available to registered professional repairers for free. The highest score, 5, is assigned when repair information is free for both professional repairers and end users.  
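These availability tiers reduce to a simple conditional. Only tiers 1, 3, and 5 are described above, so the even-numbered intermediate scores are omitted in this sketch.

```python
def repair_info_score(free_for_pros, free_for_end_users):
    """JRC repair-information tiers as described above (1, 3, and 5 only)."""
    if free_for_pros and free_for_end_users:
        return 5  # information free for everyone
    if free_for_pros:
        return 3  # free for registered professional repairers only
    return 1      # available to professionals at a "reasonable" price

print(repair_info_score(True, True))  # 5
```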

The JRC system also requires that supporting repair information be provided: a list of necessary repair and test equipment; diagnostic fault and error information (including manufacturer-specific codes); component and diagnosis information (including minimum and maximum theoretical values for measurements); instructions for software and firmware; information on how to access data records of reported failures; the procedure for user authorization of parts replacement, when required for a repair; and the necessary software tools and firmware.

Other Factors

Repair is a way to extend longevity, giving your device a new lease on life—but it may still be short-lived if software support dries up prematurely. So even though it’s not strictly part of the repair process, the JRC also checks the duration of software updates from the manufacturer—both for functionality (i.e. new OS and feature updates) and for security. The JRC’s minimum requirement is for 5 years of security updates, and 3 years of functionality updates. The maximum score is reserved for devices with security updates for at least 7 years and functionality updates for 6 years. iFixit doesn’t currently factor in software updates when calculating repairability scores, but it’s an area of active interest and we may incorporate a scoring method for this in the future.
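The update-duration thresholds can likewise be sketched as a scoring function. Only the minimum (1 point) and maximum (5 points) tiers are given above, so the intermediate tiers and the handling of non-compliance in this sketch are placeholders.

```python
def software_update_score(security_years, functionality_years):
    """Map committed update durations to the JRC tiers mentioned above.

    Only the minimum (1) and maximum (5) tiers come from the text;
    returning 0 for non-compliance is an illustrative assumption.
    """
    if security_years >= 7 and functionality_years >= 6:
        return 5  # maximum score
    if security_years >= 5 and functionality_years >= 3:
        return 1  # minimum ecodesign requirement met
    return 0      # below the minimum requirement (assumption)

print(software_update_score(7, 6))  # 5
```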

One recent major addition to the iFixit scoring rubric is for parts pairing and calibration issues. Ideally, installing an identical replacement part should be enough to complete your repair. But if it doesn’t work correctly because the manufacturer withholds the tools needed to initialize, pair, or calibrate the new part, it’s often the same as no repair at all. Devices that exhibit no parts pairing issues in iFixit’s tests will not see their scores affected—but the penalty can be severe if multiple repairs are hobbled by these kinds of problems.

Conclusion: It’s Complicated 

The iFixit and the JRC scoring methods share many aspects, but are two different systems. Both stem from the EN45554 standard, and both consider product design, the availability of spare parts, and repair information as key aspects in assessing the repairability of a device. Both are great tools for evaluating a product’s repairability. However, their goals, scope, criteria, and emphasis (weights) differ, as does how they actually calculate the score. And they aggregate into different ranges, presenting different scales for the consumer to consider: iFixit’s ranges from 0 to 10, with 10 being best or easiest to repair, while the JRC scores from 1 to 5, where the lowest score signifies that only the minimum requirements laid out in the ecodesign regulation are met.

TL;DR: while iFixit and the JRC both score repairability, the scores and results are not comparable between methods. They serve different purposes.

iFixit’s scoring method, with its greater emphasis on product design and DIY repair, may be more informative both for product designers and for consumers who are interested in purchasing hardware they can potentially fix themselves. While the JRC method produces scores in the range of 1-5, iFixit’s 0-10 scale provides additional granularity that may better help consumers evaluate roughly similar devices when making purchasing decisions. On the other hand, the JRC method considers some additional criteria such as duration of software and security updates that may be highly relevant for long-term purchasing decisions, which are not currently folded into iFixit’s repairability scoring methodology (but could certainly be considered in the future). The JRC method also considers at least some repair resources that are only available to professional repair technicians, which is outside the scope of iFixit’s DIY-centered scoring methodology.

The pros and cons of each method could be debated indefinitely, but ultimately, both are useful tools for learning about how repairable a device is. Like all complex systems, there is no perfect way to capture all the elements of repair in one system. These systems continue to evolve—stay tuned for updates—and refine how they address repair topics and distill that information for public awareness. 

We’ve made it our ongoing mission to educate consumers around repairability and to empower them with tools and information whenever we can. As our knowledge and experience grows, we’ll continue to share what we learn to help you make more repairable choices and fix the world.