TRTC supports four room entry modes. Video call (VideoCall) and audio call (VoiceCall) are the call modes, and interactive video streaming (Live) and interactive audio streaming (VoiceChatRoom) are the live streaming modes.
The live streaming modes allow a maximum of 100,000 concurrent users in each room with smooth mic on/off. Co-anchoring latency is kept below 300 ms and watch latency below 1,000 ms. The live streaming modes are suitable for use cases such as low-latency interactive live streaming, interactive classrooms for up to 100,000 participants, video dating, online education, remote training, and mega-scale conferencing.
TRTC services use two types of server nodes: access servers and proxy servers.
In the live streaming modes, TRTC introduces the concept of "role". Users take either the "anchor" or the "audience" role. Anchors are assigned to access servers, and audience members to proxy servers. Each room allows up to 100,000 users in the audience role.
For an audience member to speak, they must switch their role (switchRole) to "anchor". During the switch, the user is migrated from a proxy server to an access server. TRTC's low-latency streaming and smooth mic on/off technologies keep this process short.
You can visit GitHub to obtain the sample code used in this document.
Note: If your access to GitHub is slow, download the ZIP file here.
You can integrate the TRTC SDK into your project in the following ways:
The TRTC SDK has been released to the mavenCentral repository, and you can configure Gradle to download updates automatically.
The TRTC SDK ships with TRTC-API-Example, which offers sample code for your reference. Use Android Studio to open your project and follow the steps below to modify the app/build.gradle file.
Add the TRTC SDK dependency to dependencies.
dependencies {
    implementation 'com.tencent.liteav:LiteAVSDK_TRTC:latest.release'
}
In defaultConfig, specify the CPU architectures to be used by your application.
Note: Currently, the TRTC SDK supports armeabi-v7a and arm64-v8a.
defaultConfig {
    ndk {
        abiFilters "armeabi-v7a", "arm64-v8a"
    }
}
You can download the ZIP file of the SDK and integrate it into your project as instructed in SDK Quick Integration > Android.
Add camera, mic, and network permission requests in AndroidManifest.xml.
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.READ_PHONE_STATE" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.BLUETOOTH" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
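On Android 6.0 and later, the CAMERA and RECORD_AUDIO permissions declared above must also be granted at runtime before capturing starts. Below is a minimal sketch using AndroidX; the request code value is arbitrary and chosen just for this example.
// Minimal runtime-permission sketch (AndroidX); run before starting camera/mic capture
// Requires: android.Manifest, android.content.pm.PackageManager, java.util.List, java.util.ArrayList,
//           androidx.core.app.ActivityCompat, androidx.core.content.ContextCompat
private static final int PERMISSION_REQ_CODE = 1001; // arbitrary request code for this example

private void requestAVPermissionsIfNeeded(Activity activity) {
    String[] required = {Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO};
    List<String> missing = new ArrayList<>();
    for (String permission : required) {
        if (ContextCompat.checkSelfPermission(activity, permission) != PackageManager.PERMISSION_GRANTED) {
            missing.add(permission);
        }
    }
    if (!missing.isEmpty()) {
        ActivityCompat.requestPermissions(activity, missing.toArray(new String[0]), PERMISSION_REQ_CODE);
    }
}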
Call the sharedInstance() API to create a TRTCCloud instance.
// Create a `TRTCCloud` instance
mTRTCCloud = TRTCCloud.sharedInstance(getApplicationContext());
mTRTCCloud.setListener(new TRTCCloudListener());
Call setListener to subscribe to event callbacks and listen for event and error notifications.
// Error notifications indicate that the SDK has stopped working and therefore must be listened for
@Override
public void onError(int errCode, String errMsg, Bundle extraInfo) {
    Log.d(TAG, "sdk callback onError");
    if (activity != null) {
        Toast.makeText(activity, "onError: " + errMsg + "[" + errCode + "]", Toast.LENGTH_SHORT).show();
        if (errCode == TXLiteAVCode.ERR_ROOM_ENTER_FAIL) {
            activity.exitRoom();
        }
    }
}
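For clarity, the onError override above lives in a TRTCCloudListener subclass that you register with setListener; a minimal sketch follows (the class name MyTRTCCloudListener is illustrative).
// Minimal sketch: a listener subclass registered with the SDK (class name is illustrative)
public class MyTRTCCloudListener extends TRTCCloudListener {
    @Override
    public void onError(int errCode, String errMsg, Bundle extraInfo) {
        Log.e("TRTC", "onError: " + errMsg + " [" + errCode + "]");
        // Handle fatal errors here, e.g. exit the room on TXLiteAVCode.ERR_ROOM_ENTER_FAIL
    }
}

// Register it right after creating the TRTCCloud instance
mTRTCCloud.setListener(new MyTRTCCloudListener());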
TRTCParams
When calling the enterRoom() API, you need to pass in a key parameter TRTCParams, which includes the following required fields:
Parameter | Type | Description | Example |
---|---|---|---|
sdkAppId | Number | Application ID, which you can view in the TRTC console. | 1400000123 |
userId | String | Can contain only letters (a-z and A-Z), digits (0-9), underscores, and hyphens. We recommend you set it based on your business account system. | test_user_001 |
userSig | String | userSig is calculated based on userId. For the calculation method, please see UserSig. | eJyrVareCeYrSy1SslI... |
roomId | Number | Numeric room ID. For a string-type room ID, use strRoomId in TRTCParams. | 29834 |
Note:
- In TRTC, users with the same userId cannot be in the same room at the same time, as this will cause a conflict.
- The value of appScene must be the same on each client. An inconsistent appScene may cause unexpected problems.
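For reference, here is a minimal sketch of filling in TRTCParams, assuming a userSig string already fetched from your server; it uses the string-type strRoomId mentioned in the table instead of the numeric roomId.
// Minimal TRTCParams sketch (values are illustrative)
TRTCCloudDef.TRTCParams params = new TRTCCloudDef.TRTCParams();
params.sdkAppId = 1400000123;        // application ID from the TRTC console
params.userId = "test_user_001";     // your business account ID
params.userSig = userSig;            // calculated from userId on your server
params.strRoomId = "room_test_001";  // string room ID; use roomId for numeric IDs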
Call setLocalViewFillMode() to set the rendering mode of the local video:
- Fill: aspect fill. The image may be scaled up and cropped, but there are no black bars.
- Fit: aspect fit. The image may be scaled down to ensure that it is displayed in its entirety, and there may be black bars.
// Sample code: publish the local audio/video stream
mTRTCCloud.setLocalViewFillMode(TRTCCloudDef.TRTC_VIDEO_RENDER_MODE_FIT);
mTRTCCloud.startLocalPreview(mIsFrontCamera, localView);
// Set local video encoding parameters
TRTCCloudDef.TRTCVideoEncParam encParam = new TRTCCloudDef.TRTCVideoEncParam();
encParam.videoResolution = TRTCCloudDef.TRTC_VIDEO_RESOLUTION_960_540;
encParam.videoFps = 15;
encParam.videoBitrate = 1200;
encParam.videoResolutionMode = TRTCCloudDef.TRTC_VIDEO_RESOLUTION_MODE_PORTRAIT;
mTRTCCloud.setVideoEncoderParam(encParam);
mTRTCCloud.startLocalAudio();
The SDK provides three beauty filter styles:
- Smooth: smooth. This style features a more obvious skin smoothing effect and is typically used by influencers.
- Nature: natural. This style retains more facial details and looks more natural.
- Pitu: this style is supported only in the Professional Edition.
Set the strength of the skin smoothing filter (5 is recommended) and of the whitening filter (5 is recommended) as needed, as shown in the sketch below.
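A minimal sketch of these calls, assuming the beauty manager obtained via getBeautyManager(); the style constant value below is illustrative, so check TXBeautyManager's documentation for the exact constants in your SDK version.
// Minimal beauty-filter sketch (style value is illustrative; see TXBeautyManager docs)
TXBeautyManager beautyManager = mTRTCCloud.getBeautyManager();
beautyManager.setBeautyStyle(0);     // 0: Smooth, 1: Nature (Pitu requires the Professional Edition)
beautyManager.setBeautyLevel(5);     // skin smoothing strength; 5 is recommended
beautyManager.setWhitenessLevel(5);  // whitening strength; 5 is recommended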
To enter a room as an anchor:
1. Set the role field in TRTCParams to TRTCCloudDef.TRTCRoleAnchor to take the role of "anchor".
2. Call enterRoom(), specifying appScene, and a room whose ID is the value of the roomId field in TRTCParams will be created.
- TRTCCloudDef.TRTC_APP_SCENE_LIVE: the interactive video streaming mode, which is used in the example of this document
- TRTCCloudDef.TRTC_APP_SCENE_VOICE_CHATROOM: the interactive audio streaming mode
3. After room entry, you will receive the onEnterRoom(result) callback: if result is greater than 0, room entry succeeds, and the value indicates the time (ms) room entry takes; if result is less than 0, room entry fails, and the value is the error code for the failure.
public void enterRoom() {
    TRTCCloudDef.TRTCParams trtcParams = new TRTCCloudDef.TRTCParams();
    trtcParams.sdkAppId = sdkappid;
    trtcParams.userId = userid;
    trtcParams.roomId = 908;
    trtcParams.userSig = usersig;
    mTRTCCloud.enterRoom(trtcParams, TRTCCloudDef.TRTC_APP_SCENE_LIVE);
}
@Override
public void onEnterRoom(long result) {
    if (result > 0) {
        toastTip("Entered room successfully; the total time used is [" + result + "] ms");
    } else {
        toastTip("Failed to enter the room; the error code is [" + result + "]");
    }
}
To enter a room as audience and watch the live stream:
1. Set the role field in TRTCParams to TRTCCloudDef.TRTCRoleAudience to take the role of "audience".
2. Call enterRoom() to enter the room whose ID is the value of the roomId field in TRTCParams, specifying appScene.
- TRTCCloudDef.TRTC_APP_SCENE_LIVE: the interactive video streaming mode, which is used in the example of this document
- TRTCCloudDef.TRTC_APP_SCENE_VOICE_CHATROOM: the interactive audio streaming mode
3. Watch the anchor's video (a sketch follows this list):
- If you know the anchor's userId, call startRemoteView(userId, view) with the anchor's userId passed in to play the anchor's video.
- If you don't know the anchor's userId, find the anchor's userId in the onUserVideoAvailable() callback, which you will receive after room entry, and call startRemoteView(userId, view) with the anchor's userId passed in to play the anchor's video.
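The following is a minimal sketch of this audience-side playback, assuming a TXCloudVideoView named remoteView from your layout (the name is illustrative).
// Minimal sketch: play the anchor's video once it becomes available (remoteView is illustrative)
@Override
public void onUserVideoAvailable(String userId, boolean available) {
    if (available) {
        mTRTCCloud.startRemoteView(userId, remoteView);  // start playing the anchor's video
    } else {
        mTRTCCloud.stopRemoteView(userId);               // the anchor turned off the camera
    }
}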
To co-anchor, an audience member calls switchRole() to switch the role to anchor (TRTCCloudDef.TRTCRoleAnchor), then starts the local audio and preview as shown below.
// Sample code: start co-anchoring
mTRTCCloud.switchRole(TRTCCloudDef.TRTCRoleAnchor);
mTRTCCloud.startLocalAudio();
mTRTCCloud.startLocalPreview(mIsFrontCamera, localView);
// Sample code: end co-anchoring
mTRTCCloud.switchRole(TRTCCloudDef.TRTCRoleAudience);
mTRTCCloud.stopLocalAudio();
mTRTCCloud.stopLocalPreview();
Anchors from two rooms can compete with each other without exiting their current rooms.
Suppose anchor A is in room "001" and anchor B is in room "002". For anchor A to start the competition, call connectOtherRoom() and pass the roomId and userId of anchor B in the format of {"roomId": 978,"userId": "userB"} to the API.
After anchor A uses connectOtherRoom() to call anchor B in room "002" successfully, all users in room "001" will receive the onUserVideoAvailable(B, true) and onUserAudioAvailable(B, true) callbacks, and all users in room "002" will receive the onUserVideoAvailable(A, true) and onUserAudioAvailable(A, true) callbacks.
// Sample code: cross-room competition
mTRTCCloud.ConnectOtherRoom(String.format("{\"roomId\":%s,\"userId\":\"%s\"}", roomId, username));
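To confirm the result of the cross-room call, you can handle the corresponding callback in your TRTCCloudListener; the sketch below is an assumption, so verify the callback name and signature against the TRTCCloudListener documentation for your SDK version.
// Minimal sketch: result callback of the cross-room call (verify against your SDK version)
@Override
public void onConnectOtherRoom(String userId, int errCode, String errMsg) {
    if (errCode == 0) {
        Log.i(TAG, "Cross-room competition with " + userId + " started");
    } else {
        Log.w(TAG, "ConnectOtherRoom failed: " + errMsg + " [" + errCode + "]");
    }
}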
Call exitRoom() to exit the room. The SDK disables and releases devices such as cameras and mics during room exit. Therefore, room exit is not an instant process. It completes only after the onExitRoom() callback is received.
// Please wait for the `onExitRoom` callback after calling the room exit API.
mTRTCCloud.exitRoom();
@Override
public void onExitRoom(int reason) {
    Log.i(TAG, "onExitRoom: reason = " + reason);
}
Note: If your application integrates multiple audio/video SDKs, please wait until you receive the onExitRoom callback before starting other SDKs; otherwise, a device busy error may occur.
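A minimal sketch of this exit sequence; startOtherSDK() is just an illustrative placeholder for whatever other SDK your app needs to start.
// Minimal exit sketch: wait for onExitRoom before using the camera/mic elsewhere
public void leaveRoom() {
    mTRTCCloud.exitRoom();   // devices are released asynchronously
}

@Override
public void onExitRoom(int reason) {
    Log.i(TAG, "onExitRoom: reason = " + reason);
    startOtherSDK();         // illustrative placeholder: safe to start other audio/video SDKs now
}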