[Project] Software Carbon Efficiency Rating (SCER) #44
Comments
WG: WG approves project. Start may be delayed due to leave.
@tmcclell / @seanmcilroy29 to be clear, is this issue closed or open? Has it been approved?
@jawache - The Standards WG has approved this, and it has now been submitted to the OC for review and approval.
Thanks, will add an agenda item to the next meeting to discuss clarification on the process here. The intention wasn't to add months to the startup time for an incubation project or to take decision making out of the hands of the WG; it was more to give the OC a chance to object than to approve. We should also handle objections async, since the OC meets infrequently.
Project Approved
Working Title: Software Carbon Efficiency Rating (SCER)
Related issues or discussions:
Comparable SCI Benchmark Proposal sci#359 (collection of issues that discussed this idea)
Previous discussions on Standards-WG: https://docs.google.com/document/d/1rCNbwKiegUorrtuBuw-Ywdum1rHBWfAL-gsaOmGCcpc/edit#heading=h.3skgaml8cvuo
Tagline: A consumer-friendly rating for software carbon intensity
Abstract:
This project aims to develop a standard for a benchmark platform and test metrics for evaluating the carbon efficiency of software. The initiative will provide comparable scores for software with the same functionality, informing procurement decisions and potentially shaping regulations similar to ENERGY STAR or EPA Fuel Economy Ratings.
Quote:
Audience:
This standard is targeted toward individuals who use software either as a standalone product or as a service. By providing a rating system for carbon efficiency, consumers will be able to make informed decisions when choosing between similar offerings, such as music streaming services. Large organizations' procurement departments will also benefit from this rating system when making decisions about service contracts or software purchases. Additionally, software companies can use this rating system as a marketing tool to compete based on the efficiency of their offerings.
Governance: Which working group(s) do you think should govern this project?
Problem:
There are no broadly accepted standards that provide a consumer-friendly rating for software carbon intensity. In Germany, the Blue Angel DE-UZ 215 "Resources and Energy-Efficient Software Products" criteria are the closest existing proposal, but this standard lacks adoption and only covers desktop software with graphical user interfaces.
Possible strategies and approaches have been proposed in academic research, for example "Sustainable software products - Towards assessment criteria for resource and energy efficiency". This indicates growing interest in software ratings that address sustainability.
Our proposed Software Carbon Efficiency Rating (SCER) presents a solution by establishing a comprehensive, globally applicable standard. SCER focuses on software's core functionality across various platforms, not just desktop ones. It enables comparability, aiding consumers and corporations alike in making environmentally conscious decisions. By creating a controlled testing environment, SCER ensures fair competition among software providers and promotes transparency and accountability in the software industry. Ultimately, SCER is the next step in harmonizing digital innovation and environmental sustainability.
Solution:
Our solution builds upon Aveva's research, which was presented in the Standards-WG. Aveva developed a test platform that uses standard hardware to assess the energy consumption of different software configurations. We believe that, to a certain degree, we can establish specific parameters for software of a particular category (e.g., databases) so that products in that category can be compared on this platform.
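As a rough sketch of what such a category-level benchmark definition could look like, a Python representation might resemble the following. All class names, hardware values, and workload parameters below are hypothetical illustrations, not part of Aveva's platform or any agreed standard.

```python
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class HardwareProfile:
    """Fixed reference hardware the benchmark runs on, so that results stay comparable."""
    cpu_model: str
    memory_gb: int
    storage: str


@dataclass
class CategoryBenchmark:
    """Benchmark definition for one software category (e.g. databases)."""
    category: str
    hardware: HardwareProfile
    # Workload parameters held constant for every candidate product in the category.
    workload_params: dict[str, int] = field(default_factory=dict)


# Hypothetical example: a relational-database benchmark pinned to one reference machine.
db_benchmark = CategoryBenchmark(
    category="relational-database",
    hardware=HardwareProfile(
        cpu_model="reference-x86-server",  # placeholder, not a real SCER requirement
        memory_gb=64,
        storage="1TB NVMe SSD",
    ),
    workload_params={"clients": 50, "transactions": 1_000_000, "dataset_gb": 10},
)
```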
We'll define standard workloads for different categories of software and measure them accordingly. This involves categorizing software and setting standards based on the functional unit of the SCI standard. For instance, for a music streaming service, the functional unit could be "per non-cached minute of streamed music". However, our challenge is to develop guidelines that help us make informed decisions without being overwhelmed by the complexities of modern software architecture.
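To illustrate how a workload measurement could be turned into a consumer-facing rating, here is a minimal Python sketch that applies the SCI formula (SCI = ((E × I) + M) per R) to the streaming example above and maps the result onto letter bands. The band thresholds, input numbers, and function names are hypothetical illustrations only, not part of the SCER proposal itself.

```python
from __future__ import annotations


def sci_score(energy_kwh: float, grid_intensity: float,
              embodied_gco2e: float, functional_units: float) -> float:
    """Carbon per functional unit, following the SCI formula SCI = ((E * I) + M) per R.

    E = energy consumed (kWh), I = grid carbon intensity (gCO2e/kWh),
    M = embodied emissions attributed to the run (gCO2e), R = number of functional units.
    """
    return ((energy_kwh * grid_intensity) + embodied_gco2e) / functional_units


def scer_band(score: float, thresholds: list[tuple[float, str]]) -> str:
    """Map a per-functional-unit score onto a consumer-facing letter band.

    The thresholds are illustrative placeholders, not values from any published standard.
    """
    for upper_limit, band in thresholds:
        if score <= upper_limit:
            return band
    return thresholds[-1][1]


# Hypothetical bands for "gCO2e per non-cached minute of streamed music".
BANDS = [(0.5, "A"), (1.0, "B"), (2.0, "C"), (4.0, "D"), (float("inf"), "E")]

score = sci_score(energy_kwh=0.002, grid_intensity=400.0,
                  embodied_gco2e=0.1, functional_units=1.0)
print(f"{score:.2f} gCO2e per functional unit -> SCER band {scer_band(score, BANDS)}")
```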
Closure:
The project's success can be judged by how widely the standard is adopted by consumers and technology companies. One sign of adoption would be third-party entities using our guidelines to evaluate software and publish scores; such entities could even build a business around offering these evaluation services. Technology companies citing their scores in marketing campaigns would be a further indicator of success.