Commit
Merge pull request #3682 from AgoraIO/j-api-batch5
J api batch5
jinyuagora authored Jul 8, 2024
2 parents 7173fad + cdcc4c0 commit 252416b
Showing 8 changed files with 127 additions and 120 deletions.
35 changes: 19 additions & 16 deletions dita/RTC-NG/API/api_imediaengine_pullaudioframe.dita
@@ -24,23 +24,26 @@
<codeblock props="flutter" outputclass="language-dart">Future&lt;void&gt; pullAudioFrame(AudioFrame frame);</codeblock>
<codeblock props="reserve" outputclass="language-cpp"></codeblock></p>
</section>
<section id="detailed_desc">
<title>Details</title>
<p>Before using this method, you need to call <xref keyref="setExternalAudioSink"/><codeph>(<parmname>enabled</parmname>: <ph keyref="true"/>)</codeph> to notify the App to enable and configure external rendering.</p>
<section id="detailed_desc" deliveryTarget="details" otherprops="no-title">
<p>After calling this method, the App proactively pulls the decoded and mixed remote audio data for audio playback.</p>
<note type="attention">
<ul>
<li>This method must be called after joining a channel.</li>
<li>Both this method and the <xref keyref="onPlaybackAudioFrame"/> callback can be used to obtain the mixed remote audio data. Note that after you call <apiname keyref="setExternalAudioSink"/> to enable external audio rendering, the App can no longer obtain data from the <apiname keyref="onPlaybackAudioFrame"/> callback; therefore, choose between this method and the <apiname keyref="onPlaybackAudioFrame"/> callback based on your actual business needs. Their processing mechanisms differ as follows:
</section>
<section id="timing" deliveryTarget="details">
<title>When to call</title>
<p>This method must be called after joining a channel.</p>
<p>Before calling this method, you need to call <xref keyref="setExternalAudioSink"/><codeph>(<parmname>enabled</parmname>: <ph keyref="true"/>)</codeph> to enable and configure external rendering.</p>
</section>
<section id="restriction" deliveryTarget="details">
<title>Restrictions</title>
<p>Both this method and the <xref keyref="onPlaybackAudioFrame"/> callback can be used to obtain the mixed remote audio playback data. After you call <apiname keyref="setExternalAudioSink"/> to enable external audio rendering, the App can no longer obtain data from the <apiname keyref="onPlaybackAudioFrame"/> callback; therefore, choose between this method and the <apiname keyref="onPlaybackAudioFrame"/> callback based on your actual business needs. Their processing mechanisms differ as follows:
<ul>
<li>After calling this method, the App pulls audio data proactively. By configuring the audio data, the SDK can adjust the buffer to help the App handle latency, effectively avoiding jitter in audio playback.</li>
<li>The SDK transmits audio data to the App through the <apiname keyref="onPlaybackAudioFrame"/> callback. If the App's processing is delayed, jitter may occur in audio playback.</li>
</ul></li>
<li>This method only pulls the mixed remote audio data. To obtain audio data from different processing stages such as capture and playback, call <xref keyref="registerAudioFrameObserver"/> to register the corresponding callbacks.</li>
</ul> </note> </section>
<section id="parameters" props="native unreal bp unity flutter cs">
<title>Parameters</title>
<parml>
<li>After calling this method, the App pulls audio data proactively. By configuring the audio data, the SDK can adjust the buffer to help the App handle latency, effectively avoiding jitter in audio playback.</li>
<li>After the <apiname keyref="onPlaybackAudioFrame"/> callback is registered, the SDK transmits audio data to the App through that callback. If the App delays processing the audio frames, jitter may occur in audio playback.</li>
</ul></p>
<p>This method only pulls the mixed remote audio playback data. To obtain the raw captured audio data, the raw per-stream playback data before mixing, and so on, call <xref keyref="registerAudioFrameObserver"/> to register the corresponding callbacks.</p>
</section>
<section id="parameters" deliveryTarget="details">
<title><ph props="android apple cpp unreal bp flutter unity cs">Parameters</ph></title>
<parml props="android apple cpp unreal bp flutter unity cs">
<plentry props="cpp unreal bp unity flutter cs">
<pt>frame</pt>
<pd>A pointer to <xref keyref="AudioFrame"/>.</pd>
@@ -69,5 +72,5 @@
<li>If the method call succeeds, an <apiname keyref="AudioFrame" /> object is returned.</li>
<li>If the method call fails, an error code is returned.</li>
</ul> </section>
</refbody>
</refbody>
</reference>
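The pull-based rendering flow documented above can be sketched in Dart (the language of the Flutter signatures in the diff). Only `setExternalAudioSink` and `pullAudioFrame` come from the source; the function shape, the `AudioFrame` construction, and the loop control are illustrative assumptions:

```dart
// Hedged sketch, not a definitive implementation. Assumes `mediaEngine`
// is an initialized Agora MediaEngine and that the app joins a channel
// between setup and the pull loop (pullAudioFrame requires it).
Future<void> runPullLoop(
    MediaEngine mediaEngine, bool Function() keepRendering) async {
  // Enable external rendering BEFORE joining the channel.
  await mediaEngine.setExternalAudioSink(
      enabled: true, sampleRate: 48000, channels: 2);

  // ... join the channel here ...

  while (keepRendering()) {
    // Constructor arguments omitted as an assumption; real field values
    // (samples per channel, buffer, etc.) are app-specific.
    final frame = AudioFrame();
    // The app pulls decoded, mixed remote audio on its own schedule,
    // letting the SDK's buffer absorb processing latency.
    await mediaEngine.pullAudioFrame(frame);
    // ... hand the pulled buffer to the app's own audio renderer ...
  }
}
```

This mirrors the trade-off the doc describes: pulling lets the SDK's buffer compensate for app-side delay, whereas the `onPlaybackAudioFrame` push model jitters if the app falls behind.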
20 changes: 13 additions & 7 deletions dita/RTC-NG/API/api_imediaengine_pushaudioframe0.dita
@@ -23,15 +23,21 @@
<codeblock props="flutter" outputclass="language-dart">Future&lt;void> pushAudioFrame({required AudioFrame frame, int trackId = 0});</codeblock>
<codeblock props="reserve" outputclass="language-cpp"></codeblock></p>
</section>
<section id="detailed_desc">
<title>Details</title>
<note type="attention">
<section id="detailed_desc" deliveryTarget="details" otherprops="no-title">
<p>Call this method to push external audio frames through the audio track.</p>
</section>
<section id="timing" deliveryTarget="details">
<title>When to call</title>
<p>Before calling this method to push external audio data, complete the following steps:<ol>
<li>Call <xref keyref="createCustomAudioTrack"/> to create an audio track and obtain the audio track ID.</li>
<li>When calling <xref keyref="joinChannel2"/> to join a channel, set <parmname>publishCustomAudioTrackId</parmname> in <xref keyref="ChannelMediaOptions"/> to the ID of the audio track you want to publish, and set <parmname>publishCustomAudioTrack</parmname> to <codeph><ph keyref="true"/></codeph>.</li>
</ol></p>
</note> </section>
<section id="parameters">
</section>
<section id="restriction" deliveryTarget="details">
<title>Restrictions</title>
<p>None.</p>
</section>
<section id="parameters" deliveryTarget="details">
<title>Parameters</title>
<parml>
<plentry>
@@ -46,8 +52,8 @@
<section id="return_values">
<title><ph keyref="return-section-title"/></title>
<p props="flutter">When the method call succeeds, there is no return value; when it fails, an <xref keyref="AgoraRtcException"/> exception is thrown, and you need to catch and handle it. <ph props="cn">See <xref keyref="error-code-link"/> for details and resolution suggestions.</ph></p>
<ul>
<li props="cpp unreal bp unity electron rn cs">0: The method call succeeds.</li>
<ul props="cpp unreal bp unity electron rn cs">
<li>0: The method call succeeds.</li>
<li>&lt; 0: The method call fails. <ph props="cn">See <xref keyref="error-code-link"/> for details and resolution suggestions.</ph></li>
</ul> </section>
</refbody>
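The two-step setup this hunk documents (create a custom track, then publish it via `ChannelMediaOptions`) might look as follows in Dart. `createCustomAudioTrack`'s parameter names and the enum/config values are assumptions that may differ by SDK version; the token and channel name are placeholders:

```dart
// Sketch under stated assumptions, not a verified implementation.
Future<int> publishCustomAudio(RtcEngine engine) async {
  // Step 1: create the custom audio track and keep its ID.
  final trackId = await engine.createCustomAudioTrack(
      trackType: AudioTrackType.audioTrackMixable,      // assumed value
      config: const AudioTrackConfig(enableLocalPlayback: false));

  // Step 2: publish that track ID when joining the channel.
  await engine.joinChannel(
      token: '<your-token>',       // placeholder
      channelId: '<your-channel>', // placeholder
      uid: 0,
      options: ChannelMediaOptions(
          publishCustomAudioTrack: true,
          publishCustomAudioTrackId: trackId));

  // The returned ID is what you later pass to
  // pushAudioFrame(frame: ..., trackId: trackId).
  return trackId;
}
```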
39 changes: 21 additions & 18 deletions dita/RTC-NG/API/api_imediaengine_setexternalaudiosink.dita
@@ -34,33 +34,36 @@
{required bool enabled, required int sampleRate, required int channels});</codeblock>
<codeblock props="reserve" outputclass="language-cpp"></codeblock></p>
</section>
<section id="detailed_desc">
<title>Details</title>
<p>This method applies to scenarios where you render audio yourself. After external audio rendering is enabled, you can call <xref keyref="pullAudioFrame" /> to pull remote audio data. The App can process the pulled raw audio data before rendering to achieve the desired audio effect.</p>
<note props="hide">
<ul>
<li>Before using this method, set <parmname>enableAudioDevice</parmname> in <xref keyref="RtcEngineConfig" /> to <codeph><ph keyref="false" /></codeph>.</li>
<li>This method must be called before joining a channel.</li>
<li>After external audio rendering is enabled, the App can no longer obtain data from the <xref keyref="onPlaybackAudioFrame" /> callback.</li>
</ul> </note> </section>
<section id="parameters">
<section id="detailed_desc" deliveryTarget="details" otherprops="no-title">
<p>After calling this method to enable external audio rendering, you can call <xref keyref="pullAudioFrame" /> to pull remote audio data. The App can process the pulled raw audio data before rendering to achieve the desired audio effect.</p>
</section>
<section id="scenario" deliveryTarget="details">
<title>Applicable scenarios</title>
<p>This method applies to scenarios where you render audio yourself.</p>
</section>
<section id="timing" deliveryTarget="details">
<title>When to call</title>
<p>This method must be called before joining a channel.</p>
</section>
<section id="restriction" deliveryTarget="details">
<title>Restrictions</title>
<p>After calling this method to enable external audio rendering, the App can no longer obtain data from the <xref keyref="onPlaybackAudioFrame"/> callback.</p>
</section>
<section id="parameters" deliveryTarget="details">
<title>Parameters</title>
<parml>
<plentry>
<pt>enabled</pt>
<pd>
<p>Whether to enable external audio rendering:
<pd>Whether to enable external audio rendering:
<ul>
<li><codeph><ph keyref="true" /></codeph>: Enable external audio rendering.</li>
<li><codeph><ph keyref="false" /></codeph>: (Default) Disable external audio rendering.</li>
</ul></p>
</ul>
</pd>
</plentry>
<plentry>
<pt>sampleRate</pt>
<pd>
<p>The sample rate (Hz) of external audio rendering, which can be set to 16000, 32000, 44100, or 48000.</p>
</pd>
<pd>The sample rate (Hz) of external audio rendering, which can be set to 16000, 32000, 44100, or 48000.</pd>
</plentry>
<plentry>
<pt>channels</pt>
@@ -71,12 +74,12 @@
</ul></pd>
</plentry>
</parml> </section>
<section id="return_values">
<section id="return_values" props="android cpp framework">
<title><ph keyref="return-section-title"/></title>
<p props="flutter">When the method call succeeds, there is no return value; when it fails, an <xref keyref="AgoraRtcException"/> exception is thrown, and you need to catch and handle it. <ph props="cn">See <xref keyref="error-code-link"/> for details and resolution suggestions.</ph></p>
<ul props="android cpp unreal bp electron unity rn cs">
<li>0: The method call succeeds.</li>
<li>&lt; 0: The method call fails. <ph props="cn">See <xref keyref="error-code-link"/> for details and resolution suggestions.</ph></li>
</ul> </section>
</refbody>
</refbody>
</reference>
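The call-order constraints in this file (enable the sink before joining; pulling happens after joining; the `onPlaybackAudioFrame` callback goes silent once the sink is on) can be condensed into a short Dart sketch. The `getMediaEngine()` accessor and the sample-rate choice are assumptions, not taken from the diff:

```dart
// Minimal sketch, assuming a 6.x-style Flutter SDK where the media
// engine is obtained from an initialized RtcEngine.
Future<void> setUpExternalRendering(RtcEngine engine) async {
  final media = engine.getMediaEngine(); // accessor name is an assumption

  // Must run BEFORE joining the channel.
  await media.setExternalAudioSink(
      enabled: true, sampleRate: 44100, channels: 1);

  // From this point on, onPlaybackAudioFrame no longer delivers data;
  // the app is expected to call pullAudioFrame instead (after joining).
}
```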
41 changes: 21 additions & 20 deletions dita/RTC-NG/API/api_imediaengine_setexternalaudiosource2.dita
@@ -48,57 +48,58 @@
bool publish = true});</codeblock>
<codeblock props="reserve" outputclass="language-cpp"/></p>
</section>
<section id="detailed_desc">
<title>Details</title>
<section id="detailed_desc" deliveryTarget="details" otherprops="no-title">
<dl outputclass="deprecated">
<dlentry>
<dt>Deprecated:</dt>
<dd>This method is deprecated. Use <xref keyref="createCustomAudioTrack" /> instead.</dd>
</dlentry>
</dl>
<note type="attention">Call this method before joining a channel.</note> </section>
<section id="parameters">
</section>
<section id="timing" deliveryTarget="details">
<title>When to call</title>
<p>This method must be called before joining a channel.</p>
</section>
<section id="restriction" deliveryTarget="details">
<title>Restrictions</title>
<p>None.</p>
</section>
<section id="parameters" deliveryTarget="details">
<title>Parameters</title>
<parml>
<plentry>
<plentry id="enabled">
<pt>enabled</pt>
<pd>
<p>Whether to enable the external audio source:
<pd>Whether to enable the external audio source:
<ul>
<li><codeph><ph keyref="true"/></codeph>: Enable the external audio source.</li>
<li><codeph><ph keyref="false"/></codeph>: (Default) Disable the external audio source.</li>
</ul></p>
</ul>
</pd>
</plentry>
<plentry>
<plentry id="samplerate">
<pt>sampleRate</pt>
<pd>The sample rate (Hz) of the external audio source, which can be set to <codeph>8000</codeph>, <codeph>16000</codeph>, <codeph>32000</codeph>, <codeph>44100</codeph>, or <codeph>48000</codeph>.</pd>
</plentry>
<plentry>
<plentry id="channels">
<pt>channels</pt>
<pd>The number of channels of the external audio source, which can be set to <codeph>1</codeph> (mono) or <codeph>2</codeph> (stereo).</pd>
</plentry>
<plentry>
<pt>localPlayback</pt>
<pd id="localplayback">
<p>Whether to play the external audio source locally:
<pd id="localplayback">Whether to play the external audio source locally:
<ul>
<li><codeph><ph keyref="true"/></codeph>: Play it locally.</li>
<li><codeph><ph keyref="false"/></codeph>: (Default) Do not play it locally.</li>
</ul></p>
</ul>
</pd>
</plentry>
<plentry>
<pt>publish</pt>
<pd>
<p>Whether to publish the audio to the remote users:
<ul id="ul_agk_dnf_3qb">
<pd>Whether to publish the audio to the remote users:
<ul>
<li><codeph><ph keyref="true"/></codeph>: (Default) Publish it to the remote users.</li>
<li><codeph><ph keyref="false"/></codeph>: Do not publish it to the remote users.</li>
</ul></p>
</ul>
</pd>
</plentry>
</parml> </section>
30 changes: 19 additions & 11 deletions dita/RTC-NG/API/api_imediaengine_setexternalvideosource.dita
@@ -40,15 +40,23 @@
SenderOptions encodedVideoOption = const SenderOptions()});</codeblock>
<codeblock props="reserve" outputclass="language-cpp"></codeblock></p>
</section>
<section id="detailed_desc">
<title>Details</title>
<note type="attention">Call this method before joining a channel.</note> </section>
<section id="parameters">
<section id="detailed_desc" deliveryTarget="details" otherprops="no-title">
<p>After calling this method to enable the external video source, you can call <xref keyref="pushVideoFrame"/> to push external video data to the SDK.</p>
</section>
<section id="timing" deliveryTarget="details">
<title>When to call</title>
<p>This method must be called before joining a channel.</p>
</section>
<section id="restriction" deliveryTarget="details">
<title>Restrictions</title>
<p>Dynamically switching video sources within a channel is not supported. If you have called this method to enable the external video source and joined a channel, then to switch to the internal video source you must leave the channel, call this method to disable the external video source, and then rejoin the channel.</p>
</section>
<section id="parameters" deliveryTarget="details">
<title>Parameters</title>
<parml>
<plentry>
<pt props="ios mac android">enable</pt>
<pt props="cpp unreal bp unity electron rn flutter cs">enabled</pt>
<pt props="android hmos apple">enable</pt>
<pt props="cpp framework">enabled</pt>
<pd>Whether to enable the external video source:
<ul>
<li><codeph><ph keyref="true" /></codeph>: Enable the external video source. The SDK prepares to receive external video frames.</li>
@@ -67,18 +75,18 @@
<pt>sourceType</pt>
<pd>Whether the external video frame is encoded. See <xref keyref="EXTERNAL_VIDEO_SOURCE_TYPE" />.</pd>
</plentry>
<plentry props="unity cpp unreal bp electron rn flutter cs">
<plentry props="cpp framework">
<pt>encodedVideoOption</pt>
<pd>Video encoding options. This parameter needs to be set if <parmname>sourceType</parmname> is <apiname keyref="ENCODED_VIDEO_FRAME" />. Refer to <xref keyref="ticket-link" /> to learn how to set this parameter.</pd>
</plentry>
</parml> </section>
<section id="return_values" props="android cpp unreal bp electron unity rn flutter cs">
<section id="return_values" props="android hmos cpp framework">
<title><ph keyref="return-section-title"/></title>
<p props="flutter">When the method call succeeds, there is no return value; when it fails, an <xref keyref="AgoraRtcException"/> exception is thrown, and you need to catch and handle it. <ph props="cn">See <xref keyref="error-code-link"/> for details and resolution suggestions.</ph></p>
<ul>
<li props="android cpp unreal bp electron unity rn cs">0: The method call succeeds.</li>
<ul props="android hmos cpp unreal bp electron unity rn cs">
<li>0: The method call succeeds.</li>
<li>&lt; 0: The method call fails. <ph props="cn">See <xref keyref="error-code-link"/> for details and resolution suggestions.</ph></li>
</ul>
</section>
</refbody>
</refbody>
</reference>
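The enable-then-push sequence for the external video source might be sketched as follows in Dart. The `getMediaEngine()` accessor, the `useTexture` value, and the commented `pushVideoFrame` call shape are assumptions; only `setExternalVideoSource` and `pushVideoFrame` themselves come from the source:

```dart
// Hedged sketch of the documented ordering: enable before joining,
// push frames after joining, and leave the channel before switching
// back to the internal video source.
Future<void> startExternalVideo(RtcEngine engine) async {
  final media = engine.getMediaEngine(); // accessor name is an assumption

  // Must be called BEFORE joining the channel.
  await media.setExternalVideoSource(
      enabled: true, useTexture: false); // useTexture value is illustrative

  // ... join the channel, then repeatedly push frames, e.g.: ...
  // await media.pushVideoFrame(frame: ExternalVideoFrame(...));
}
```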
12 changes: 7 additions & 5 deletions dita/RTC-NG/API/api_irtcengine_pullaudioframe2.dita
@@ -23,17 +23,19 @@
<codeblock props="flutter" outputclass="language-dart"/>
<codeblock props="reserve" outputclass="language-cpp"></codeblock></p>
</section>
<section conkeyref="pullAudioFrame/detailed_desc" id="detailed_desc"/>
<section id="parameters">
<section conkeyref="pullAudioFrame/detailed_desc" id="detailed_desc" deliveryTarget="details" otherprops="no-title" />
<section conkeyref="pullAudioFrame/timing" id="timing" deliveryTarget="details" />
<section conkeyref="pullAudioFrame/restriction" id="restriction" deliveryTarget="details" />
<section id="parameters" deliveryTarget="details">
<title>Parameters</title>
<parml>
<plentry>
<pt>data</pt>
<pd>The remote audio data to be pulled. The data type is <codeph>ByteBuffer</codeph>.</pd>
</plentry>
<plentry>
<pt>lengthInByte</pt>
<pd>The length of the remote audio data, in bytes. The value of this parameter is determined by the duration of the audio data and the <parmname>sampleRate</parmname> and <parmname>channels</parmname> parameters of <apiname keyref="setExternalAudioSink"/>: <parmname>lengthInByte</parmname> = <parmname>sampleRate</parmname>/1000 × 2 × <parmname>channels</parmname> × audio duration (ms).</pd>
<plentry conkeyref="pullAudioFrame/length">
<pt/>
<pd/>
</plentry>
</parml> </section>
<section id="return_values">
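The buffer-size formula in the `lengthInByte` parameter above can be checked with a quick Dart calculation. The 48 kHz/stereo/10 ms values are example inputs, not values taken from the source:

```dart
// lengthInByte = sampleRate / 1000  (samples per ms)
//              * 2                  (bytes per 16-bit sample)
//              * channels
//              * duration in ms
const sampleRate = 48000; // Hz, as configured via setExternalAudioSink
const channels = 2;       // stereo
const durationMs = 10;    // example frame duration
const lengthInByte = sampleRate ~/ 1000 * 2 * channels * durationMs;
// 48 * 2 * 2 * 10 = 1920 bytes for 10 ms of 48 kHz stereo audio
```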