From a6427409760b0cbc0136a83dba462400c30c9536 Mon Sep 17 00:00:00 2001 From: Vijay Janapa Reddi Date: Fri, 15 Dec 2023 12:16:39 -0500 Subject: [PATCH] Fix rendering when more than two references appear, caused by ',' instead of ';' citation separators. --- contents/benchmarking/benchmarking.qmd | 2 +- contents/sustainable_ai/sustainable_ai.qmd | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/contents/benchmarking/benchmarking.qmd b/contents/benchmarking/benchmarking.qmd index 2a149fbd4..8df0adb6a 100644 --- a/contents/benchmarking/benchmarking.qmd +++ b/contents/benchmarking/benchmarking.qmd @@ -770,7 +770,7 @@ While this integrated perspective represents an emerging trend, the field has mu Emerging technologies can be particularly challenging to design benchmarks for given their significant differences from existing techniques. Standard benchmarks used for existing technologies may not highlight the key features of the new approach, while completely new benchmarks may be seen as contrived to favor the emerging technology over others, or yet may be so different from existing benchmarks that they cannot be understood and lose insightful value. Thus, benchmarks for emerging technologies must balance around fairness, applicability, and ease of comparison with existing benchmarks. -An example emerging technology where benchmarking has proven to be especially difficult is in [Neuromorphic Computing](@sec-neuromorphic). Using the brain as a source of inspiration for scalable, robust, and energy-efficient general intelligence, neuromorphic computing [@schuman2022] directly incorporates biologically realistic mechanisms in both computing algorithms and hardware, such as spiking neural networks [@maass1997networks] and non-von Neumann architectures for executing them [@davies2018loihi, @modha2023neural]. 
From a full-stack perspective of models, training techniques, and hardware systems, neuromorphic computing differs from conventional hardware and AI, thus there is a key challenge towards developing benchmarks which are fair and useful for guiding the technology. +An example emerging technology where benchmarking has proven to be especially difficult is in [Neuromorphic Computing](@sec-neuromorphic). Using the brain as a source of inspiration for scalable, robust, and energy-efficient general intelligence, neuromorphic computing [@schuman2022] directly incorporates biologically realistic mechanisms in both computing algorithms and hardware, such as spiking neural networks [@maass1997networks] and non-von Neumann architectures for executing them [@davies2018loihi; @modha2023neural]. From a full-stack perspective of models, training techniques, and hardware systems, neuromorphic computing differs from conventional hardware and AI, thus there is a key challenge towards developing benchmarks which are fair and useful for guiding the technology. An ongoing initiative towards developing standard neuromorphic benchmarks is NeuroBench [@yik2023neurobench]. In order to suitably benchmark neuromorphics, NeuroBench follows high-level principles of _inclusiveness_ through task and metric applicability to both neuromorphic and non-neuromorphic solutions, _actionability_ of implementation using common tooling, and _iterative_ updates to continue to ensure relevance as the field rapidly grows. NeuroBench and other benchmarks for emerging technologies provide critical guidance for future techniques which may be necessary as the scaling limits of existing approaches draw nearer. 
diff --git a/contents/sustainable_ai/sustainable_ai.qmd b/contents/sustainable_ai/sustainable_ai.qmd index 2bfe4188c..4b880cab6 100644 --- a/contents/sustainable_ai/sustainable_ai.qmd +++ b/contents/sustainable_ai/sustainable_ai.qmd @@ -79,7 +79,7 @@ The DeepMind team leveraged Google's extensive historical sensor data detailing ### Understanding Energy Needs {#understanding-energy-needs} -In the rapidly evolving field of AI, understanding the energy needs for training and operating AI models is crucial. With AI entering widespread use in many new fields [@ai_health_rise, @data_centers_wheels], the demand for AI enabled devices and data centers is expected to explode. This understanding helps us grasp why AI, particularly deep learning, is often labeled as energy-intensive. +In the rapidly evolving field of AI, understanding the energy needs for training and operating AI models is crucial. With AI entering widespread use in many new fields [@ai_health_rise; @data_centers_wheels], the demand for AI enabled devices and data centers is expected to explode. This understanding helps us grasp why AI, particularly deep learning, is often labeled as energy-intensive. #### Energy Requirements for AI Training {#energy-requirements-for-ai-training}
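The patch applies the same one-character fix in both files: inside a Pandoc-style multi-citation bracket, keys must be separated by `;` rather than `,` for Quarto to render them correctly. A hypothetical script (not part of this patch; the function name and regex are illustrative assumptions) sketching how such occurrences could be normalized in bulk:

```python
import re

def fix_citation_separators(text: str) -> str:
    """Replace comma separators with semicolons inside Pandoc-style
    multi-citation brackets, e.g. [@a, @b] -> [@a; @b]."""
    def repl(match: re.Match) -> str:
        return match.group(0).replace(", @", "; @")
    # Match bracketed spans containing at least one citation key (@...).
    # Markdown link text like [Neuromorphic Computing] has no '@' and
    # is left untouched.
    return re.sub(r"\[[^\[\]]*@[^\[\]]*\]", repl, text)

print(fix_citation_separators("them [@davies2018loihi, @modha2023neural]."))
# prints: them [@davies2018loihi; @modha2023neural].
```

Running it over both `.qmd` files would produce exactly the two-line change in this diff.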