
Commit
Merge branch 'dev'
profvjreddi committed Jan 3, 2025
2 parents 8750c0d + 9252332 commit 01eee1e
Showing 9 changed files with 54 additions and 36 deletions.
18 changes: 9 additions & 9 deletions _quarto.yml
@@ -126,10 +126,9 @@ book:
- contents/core/ai_for_good/ai_for_good.qmd
- contents/core/conclusion/conclusion.qmd
- text: "---"
- part: contents/labs/labs.qmd
chapters:
- contents/labs/overview.qmd
- contents/labs/getting_started.qmd
- part: "LABS"
- contents/labs/overview.qmd
- contents/labs/getting_started.qmd
- part: contents/labs/arduino/nicla_vision/nicla_vision.qmd
chapters:
- contents/labs/arduino/nicla_vision/setup/setup.qmd
@@ -156,9 +155,8 @@ book:
- contents/labs/shared/kws_feature_eng/kws_feature_eng.qmd
- contents/labs/shared/dsp_spectral_features_block/dsp_spectral_features_block.qmd
- text: "---"
- part: REFERENCES
chapters:
- contents/core/references.qmd
- part: "REFERENCES"
- contents/core/references.qmd

bibliography:
- contents/core/introduction/introduction.bib
@@ -343,15 +341,17 @@ format:
cite-method: citeproc
title-block-style: none
indent: 0px
fontsize: 10pt

fontsize: 9pt
colorlinks: true

reference-location: margin
citation-location: block

fig-caption: true
cap-location: margin
fig-cap-location: margin
tbl-cap-location: margin
tbl-colwidths: auto
hyperrefoptions:
- linktoc=all
- pdfwindowui
2 changes: 1 addition & 1 deletion contents/core/conclusion/conclusion.qmd
@@ -126,7 +126,7 @@ We anticipate a growing emphasis on data curation, labeling, and augmentation te

This data-centric approach will be vital in addressing the challenges of bias, fairness, and generalizability in ML systems. By actively seeking out and incorporating diverse and inclusive datasets, we can develop more robust, equitable, and applicable models for various contexts and populations. Moreover, the emphasis on data will drive advancements in techniques such as data augmentation, where existing datasets are expanded and diversified through data synthesis, translation, and generation. These techniques can help overcome the limitations of small or imbalanced datasets, enabling the development of more accurate and generalizable models.

In recent years, generative AI has taken the field by storm, demonstrating remarkable capabilities in creating realistic images, videos, and text. However, the rise of generative AI also brings new challenges for ML sysatem. Unlike traditional ML systems, generative models often demand more computational resources and pose challenges in terms of scalability and efficiency. Furthermore, evaluating and benchmarking generative models presents difficulties, as traditional metrics used for classification tasks may not be directly applicable. Developing robust evaluation frameworks for generative models is an active area of research, and something we hope to write about soon!
In recent years, generative AI has taken the field by storm, demonstrating remarkable capabilities in creating realistic images, videos, and text. However, the rise of generative AI also brings new challenges for ML systems. Unlike traditional ML systems, generative models often demand more computational resources and pose challenges in terms of scalability and efficiency. Furthermore, evaluating and benchmarking generative models presents difficulties, as traditional metrics used for classification tasks may not be directly applicable. Developing robust evaluation frameworks for generative models is an active area of research, and something we hope to write about soon!

Understanding and addressing these system challenges and ethical considerations will be important in shaping the future of generative AI and its impact on society. As ML practitioners and researchers, we are responsible for advancing the technical capabilities of generative models and developing robust systems and frameworks that can mitigate potential risks and ensure the beneficial application of this powerful technology.

2 changes: 1 addition & 1 deletion contents/core/data_engineering/data_engineering.qmd
@@ -267,7 +267,7 @@ The dataset cards provide important context on appropriate dataset usage by high

**Data Governance:** With a large amount of data storage, it is also imperative to have policies and practices (i.e., data governance) that help manage data during its life cycle, from acquisition to disposal. Data governance outlines how data is managed and includes making key decisions about data access and control. @fig-governance illustrates the different domains involved in data governance. It involves exercising authority and making decisions concerning data to uphold its quality, ensure compliance, maintain security, and derive value. Data governance is operationalized by developing policies, incentives, and penalties, cultivating a culture that perceives data as a valuable asset. Specific procedures and assigned authorities are implemented to safeguard data quality and monitor its utilization and related risks.

![An overview of the data governance framework. Source: [StarCIO.](https://www.groundwatergovernance.org/the-importance-of-governance-for-all-stakeholders/).](images/jpg/data_governance.jpg){#fig-governance}
![An overview of the data governance framework. Source: [StarCIO](https://www.groundwatergovernance.org/the-importance-of-governance-for-all-stakeholders/).](images/jpg/data_governance.jpg){#fig-governance}

Data governance utilizes three integrative approaches: planning and control, organizational, and risk-based.

2 changes: 1 addition & 1 deletion contents/core/dnn_architectures/dnn_architectures.qmd
@@ -38,7 +38,7 @@ Deep learning architecture stands for specific representation or organizations o

Neural network architectures have evolved to address specific pattern processing challenges. Whether processing arbitrary feature relationships, exploiting spatial patterns, managing temporal dependencies, or handling dynamic information flow, each architectural pattern emerged from particular computational needs. These architectures, from a computer systems perspective, require an examination of how their computational patterns map to system resources.

Most often the architectures are discussed in terms of their algorithmic structures (MLPs, CNNs, RNNs, Transformers). However, in this chapter we take a more fundamental approach by examining how their computational patterns map to hardware resources. Each section analyzes how specific Pattern Processing Needss influence algorithmic structure and how these structures map to computer system resources. The implications for computer system design require examining how their computational patterns map to hardware resources. The mapping from algorithmic requirements to computer system design involves several key considerations:
Most often the architectures are discussed in terms of their algorithmic structures (MLPs, CNNs, RNNs, Transformers). However, in this chapter we take a more fundamental approach by examining how their computational patterns map to hardware resources. Each section analyzes how specific pattern processing needs influence algorithmic structure and how these structures map to computer system resources. The implications for computer system design require examining how their computational patterns map to hardware resources. The mapping from algorithmic requirements to computer system design involves several key considerations:

1. Memory access patterns: How data moves through the memory hierarchy
2. Computation characteristics: The nature and organization of arithmetic operations
4 changes: 2 additions & 2 deletions contents/core/introduction/introduction.qmd
@@ -119,7 +119,7 @@ The move to statistical approaches fundamentally changed how we think about buil
| | | specific | | |
+---------------------+--------------------------+--------------------------+--------------------------+-------------------------------+

: Evolution of AI - Key Positive Aspects {#tbl-ai-evolution-strengths .hover .striped}
: Evolution of AI - Key Positive Aspects {#tbl-ai-evolution-strengths .hover .striped .column-page-right}

The table serves as a bridge between the early approaches we've discussed and the more recent developments in shallow and deep learning that we'll explore next. It sets the stage for understanding why certain approaches gained prominence in different eras and how each new paradigm built upon and addressed the limitations of its predecessors. Moreover, it illustrates how the strengths of earlier approaches continue to influence and enhance modern AI techniques, particularly in the era of foundation models.

@@ -420,7 +420,7 @@ This book is designed to guide you from understanding the fundamentals of ML sys

![Overview of the five fundamental system pillars of Machine Learning Systems engineering.](images/png/book_pillars.png){#fig-pillars}

As illustrated in Figure @fig-pillars, the five pillars central to the framework are:
As illustrated in @fig-pillars, the five pillars central to the framework are:

- **Data**: Emphasizing data engineering and foundational principles critical to how AI operates in relation to data.
- **Training**: Exploring the methodologies for AI training, focusing on efficiency, optimization, and acceleration techniques to enhance model performance.
2 changes: 1 addition & 1 deletion contents/core/ml_systems/ml_systems.qmd
@@ -536,7 +536,7 @@ The operational characteristics of these systems reveal another important dimens
+--------------------------+----------------------------------------------------------+----------------------------------------------------------+-----------------------------------------------------------+----------------------------------------------------------+
| Data Privacy | Basic-Moderate (Data leaves device) | High (Data stays in local network) | High (Data stays on phone) | Very High (Data never leaves sensor) |
+--------------------------+----------------------------------------------------------+----------------------------------------------------------+-----------------------------------------------------------+----------------------------------------------------------+
| Computational Power | Very High (Multiple GPUs/TPUs) | High (Edge GPUs) | Moderate (Mobile NPUs/GPUs) | Very Low (MCU/tiny processors) |
| Compute Power | Very High (Multiple GPUs/TPUs) | High (Edge GPUs) | Moderate (Mobile NPUs/GPUs) | Very Low (MCU/tiny processors) |
+--------------------------+----------------------------------------------------------+----------------------------------------------------------+-----------------------------------------------------------+----------------------------------------------------------+
| Energy Consumption | Very High (kW-MW range) | High (100s W) | Moderate (1-10W) | Very Low (mW range) |
+--------------------------+----------------------------------------------------------+----------------------------------------------------------+-----------------------------------------------------------+----------------------------------------------------------+
10 changes: 6 additions & 4 deletions contents/frontmatter/ai/socratiq.qmd
@@ -254,10 +254,12 @@ For persistent issues, please contact us at vj[@]eecs.harvard.edu.

## Providing Feedback

Your feedback helps us improve SocratiQ. You can report technical issues, suggest improvements to quiz questions, or share thoughts about AI responses using the feedback buttons located throughout the interface.
Your feedback helps us improve SocratiQ.

You can submit a [GitHub issue](https://github.com/harvard-edge/cs249r_book/issues), or if you prefer leaving feedback via Google Form, you are welcome to do so via this link:
You can report technical issues, suggest improvements to quiz questions, or share thoughts about AI responses using the feedback buttons located throughout the interface. You can submit a [GitHub issue](https://github.com/harvard-edge/cs249r_book/issues).

[Share Your Feedback]{.btn .btn-primary onclick="window.open('https://docs.google.com/forms/d/e/1FAIpQLSeK8RXgc6kbT1IbWVLjyUhwowp3x1ySbAjUQQqztdDs5ccmmQ/viewform?embedded=true', '_blank')"}
```{=html}
If you prefer leaving feedback via Google Form, you are welcome to do so via this link:
Remember: SocratiQ is designed to help you learn effectively. By consistently engaging with the quizzes, asking questions when needed, and tracking your progress, you'll get the most out of this AI learning assistant.
[Share Your Feedback]{.btn .btn-primary onclick="window.open('https://docs.google.com/forms/d/e/1FAIpQLSeK8RXgc6kbT1IbWVLjyUhwowp3x1ySbAjUQQqztdDs5ccmmQ/viewform?embedded=true', '_blank')"}
```
2 changes: 1 addition & 1 deletion index.qmd
@@ -13,7 +13,7 @@ We've created this open-source book to demystify the process of building efficie
As a living and breathing resource, this book is a continual work in progress, reflecting the ever-evolving nature of machine learning systems. Advancements in the ML landscape drive our commitment to keeping this resource updated with the latest insights, techniques, and best practices. We warmly invite you to join us on this journey by contributing your expertise, feedback, and ideas.

## Global Reach
## Global Outreach

Thank you to all our readers and visitors. Your engagement with the material keeps us motivated.

48 changes: 32 additions & 16 deletions tex/header-includes.tex
@@ -1,28 +1,41 @@
% Package imports

\usepackage[english]{babel}
\usepackage{caption}
\usepackage[outercaption, ragged]{sidecap}
\usepackage{adjustbox}
\usepackage{afterpage}
\usepackage{array}
\usepackage{atbegshi} % Package to insert content at the beginning
\usepackage{etoolbox}
\usepackage{caption}
\usepackage{etoolbox} % For redefining footnotes
\usepackage{fancyhdr}
\usepackage{fontspec}
\usepackage{geometry}
\usepackage{graphicx}
\usepackage{hyperref}
\usepackage{ifthen}
\usepackage{longtable}
\usepackage{marginfix} % Fixes the issue of margin notes being cut off
\usepackage{marginnote}
\usepackage{mathptmx}
\usepackage{newpxtext} % Palatino-like font
\usepackage{ragged2e}
\usepackage{longtable}
\usepackage{sidenotes}
\usepackage{titlesec}
\usepackage{tocloft}
\usepackage{xcolor}
\usepackage[outercaption, ragged]{sidecap}
\usepackage{etoolbox} % For redefining footnotes
\usepackage{sidenotes}
\usepackage{ifthen}
\usepackage{changepage}

\geometry{
paperwidth=7.5in,
paperheight=9.25in,
top=1in,
bottom=1in,
inner=1in,
outer=2.25in,
marginparwidth=1.5in,
twoside
}

% Define the Crimson color
\definecolor{crimson}{HTML}{A51C30}
@@ -115,13 +128,18 @@
twoside
}

% Redefine \part to do nothing
\renewcommand{\part}[1]{%
\typeout{Skipping \detokenize{#1}}% Print message in the log file
\chapter*{#1} % Render the part title without a number
\addcontentsline{toc}{part}{#1} % Add to TOC without numbering
}

% Ensure \partname is defined, just in case it's referenced elsewhere
\renewcommand{\partname}{}
% % Redefine \part to do nothing
% \renewcommand{\part}[1]{%
% \typeout{Skipping \detokenize{#1}}% Print message in the log file
% }

% % Ensure \partname is defined, just in case it's referenced elsewhere
% \renewcommand{\partname}{}

% % Redefine \part (if you want to apply the Crimson color here)
% \titleformat{\part}[display]
@@ -184,10 +202,6 @@
\setlength{\cftsubsecnumwidth}{3.25em} % Adjust width for subsection numbers
\setlength{\cftsubsubsecnumwidth}{4em} % Adjust width for subsubsection numbers





% Page numbering setup
\makeatletter
% Store whether we've seen the first of each type
@@ -235,4 +249,6 @@
\fi
\old@chapter*{#1}%
}
\makeatother
\makeatother

\AtBeginEnvironment{longtable}{\scriptsize} % Adjust to \footnotesize or \scriptsize if needed
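
The `\AtBeginEnvironment{longtable}{\scriptsize}` hook added above comes from the etoolbox package, which this header already loads. As a minimal, self-contained sketch of what the hook does (the document body here is illustrative, not taken from the book):

```latex
% Minimal sketch: etoolbox's \AtBeginEnvironment injects code at the
% start of every instance of the named environment, so each longtable
% below is typeset in \scriptsize without editing the tables themselves.
\documentclass{article}
\usepackage{etoolbox}
\usepackage{longtable}
\AtBeginEnvironment{longtable}{\scriptsize}
\begin{document}
\begin{longtable}{ll}
Column A   & Column B   \\
small text & small text \\
\end{longtable}
\end{document}
```

Because the size switch is scoped to the environment, normal body text is unaffected; swapping `\scriptsize` for `\footnotesize`, as the comment suggests, changes only the table font.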
