Fixed the issue where citations with multiple references rendered incorrectly because of comma instead of semicolon separators.
profvjreddi committed Dec 15, 2023
1 parent 2269db9 commit a642740
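
Quarto (via Pandoc) requires semicolons between citation keys grouped in a single bracket; a comma after a key is instead parsed as the start of a locator or suffix for that citation, so later references in the bracket do not render as separate citations. A minimal before/after sketch of the change, using the citation keys from the diff below:

```markdown
<!-- Broken: the comma is read as the start of a suffix for the first
     citation, so the second key does not render as its own reference -->
[@davies2018loihi, @modha2023neural]

<!-- Fixed: semicolons separate multiple citations within one bracket -->
[@davies2018loihi; @modha2023neural]
```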
Showing 2 changed files with 2 additions and 2 deletions.
contents/benchmarking/benchmarking.qmd (1 addition, 1 deletion)
@@ -770,7 +770,7 @@ While this integrated perspective represents an emerging trend, the field has mu

Emerging technologies can be particularly challenging to design benchmarks for, given how significantly they differ from existing techniques. Standard benchmarks may not highlight the key features of the new approach, while completely new benchmarks may be seen as contrived to favor the emerging technology over others, or may differ so much from existing benchmarks that they are hard to interpret and lose insightful value. Benchmarks for emerging technologies must therefore balance fairness, applicability, and ease of comparison with existing benchmarks.

-One emerging technology for which benchmarking has proven especially difficult is [Neuromorphic Computing](@sec-neuromorphic). Taking the brain as inspiration for scalable, robust, and energy-efficient general intelligence, neuromorphic computing [@schuman2022] directly incorporates biologically realistic mechanisms into both computing algorithms and hardware, such as spiking neural networks [@maass1997networks] and non-von Neumann architectures for executing them [@davies2018loihi, @modha2023neural]. Across the full stack of models, training techniques, and hardware systems, neuromorphic computing differs from conventional hardware and AI, so a key challenge is developing benchmarks that are fair and useful for guiding the technology.
+One emerging technology for which benchmarking has proven especially difficult is [Neuromorphic Computing](@sec-neuromorphic). Taking the brain as inspiration for scalable, robust, and energy-efficient general intelligence, neuromorphic computing [@schuman2022] directly incorporates biologically realistic mechanisms into both computing algorithms and hardware, such as spiking neural networks [@maass1997networks] and non-von Neumann architectures for executing them [@davies2018loihi; @modha2023neural]. Across the full stack of models, training techniques, and hardware systems, neuromorphic computing differs from conventional hardware and AI, so a key challenge is developing benchmarks that are fair and useful for guiding the technology.

An ongoing initiative toward developing standard neuromorphic benchmarks is NeuroBench [@yik2023neurobench]. To suitably benchmark neuromorphic systems, NeuroBench follows the high-level principles of _inclusiveness_, with tasks and metrics applicable to both neuromorphic and non-neuromorphic solutions; _actionability_, with implementations using common tooling; and _iterative_ updates to ensure continued relevance as the field rapidly grows. NeuroBench and other benchmarks for emerging technologies provide critical guidance for future techniques, which may become necessary as the scaling limits of existing approaches draw nearer.

contents/sustainable_ai/sustainable_ai.qmd (1 addition, 1 deletion)
@@ -79,7 +79,7 @@ The DeepMind team leveraged Google's extensive historical sensor data detailing

### Understanding Energy Needs {#understanding-energy-needs}

-In the rapidly evolving field of AI, understanding the energy needed to train and operate AI models is crucial. With AI entering widespread use in many new fields [@ai_health_rise, @data_centers_wheels], the demand for AI-enabled devices and data centers is expected to explode. This understanding helps explain why AI, particularly deep learning, is often labeled energy-intensive.
+In the rapidly evolving field of AI, understanding the energy needed to train and operate AI models is crucial. With AI entering widespread use in many new fields [@ai_health_rise; @data_centers_wheels], the demand for AI-enabled devices and data centers is expected to explode. This understanding helps explain why AI, particularly deep learning, is often labeled energy-intensive.

#### Energy Requirements for AI Training {#energy-requirements-for-ai-training}

