Orbbec SDK Documentation
Welcome to the Orbbec SDK (hereafter "the SDK") tutorial! The SDK provides both a concise high-level API and a comprehensive, flexible low-level API, helping you get to know and use Orbbec 3D cameras quickly.
Code samples — Android sample notes
All samples can be found in the OrbbecSdkExamples directory of the project.

| Name | Language | Description | Supported products |
| --- | --- | --- | --- |
| HelloOrbbec | Java | Demonstrates connecting to a device to get the SDK version and device information | All |
| DepthViewer | Java | Opens a depth stream with the specified configuration via Pipeline and renders it | Most; products with a depth sensor |
| ColorViewer | Java | Opens a color stream with the specified configuration via Pipeline and renders it | Most; products with a color sensor |
| InfraredViewer | Java | Opens an IR stream with the specified configuration via Pipeline and renders it | Some; products with IR but without separate left/right IR |
| DoubleIRViewer | Java | Opens the left and right IR streams with the specified configuration via Pipeline and renders them | Some; products with left/right IR |
| SyncAlignViewer | Java | Demonstrates alignment operations on sensor data streams | Most; products supporting depth and color |
| HotPlugin | Java | Demonstrates registering device hot-plug callbacks and handling streams after attach/detach | All |
| IMU | Java | Obtains IMU data and displays it | Some; products with an IMU |
| MultiDevice | Java | Demonstrates operating multiple devices | All |
| PointCloud | Java | Demonstrates generating a depth point cloud or RGBD point cloud and saving it as a .ply file | Most; products supporting depth and color |
| SaveToDisk | Java | Saves color frames as PNG and depth frames as raw, and converts formats via a filter | All |
| SensorControl | Java | Demonstrates device and sensor control commands | All |
| Recorder & Playback | Java | Opens streams, records the current streams to a file, and loads a file for playback | Most; products supporting depth |

Android

Base module – BaseActivity

Activity inheritance in the examples:

```
                              BaseActivity
      _____________________________|_____________________________
      |                   |                  |                  |
HelloOrbbecActivity  ColorViewerActivity  ...(XXXActivity)  DepthViewerActivity
```

HelloOrbbecActivity and the other example activities all extend BaseActivity, which mainly provides:
  1. initialization and release of OBContext;
  2. stream profile (resolution) retrieval.

Initializing and releasing OBContext

OBContext is the entry point of the Orbbec SDK; it configures logging and manages devices. An application should hold only one OBContext instance. In OrbbecSdkExamples, every Activity extends BaseActivity, which initializes the OBContext in BaseActivity#onStart() and releases it in BaseActivity#onStop(), so a single OBContext instance is maintained at all times. Note: if an application holds multiple OBContext instances at the same time, errors may occur. BaseActivity.java demonstrates the initialization and release of OBContext:
```java
/**
 * OBContext is the entry of the Orbbec SDK and supports only one instance.
 */
protected OBContext mOBContext;

protected abstract DeviceChangedCallback getDeviceChangedCallback();

protected void initSDK() {
    try {
        if (BuildConfig.DEBUG) {
            // Set the log level in code
            OBContext.setLoggerSeverity(LogSeverity.DEBUG);
        }
        DeviceChangedCallback deviceChangedCallback = getDeviceChangedCallback();
        // 1. Initialize the SDK context and listen for device changes
        String configFilePath = getXmlConfigFile();
        if (!TextUtils.isEmpty(configFilePath)) {
            mOBContext = new OBContext(getApplicationContext(), configFilePath, deviceChangedCallback);
        } else {
            mOBContext = new OBContext(getApplicationContext(), deviceChangedCallback);
        }
    } catch (OBException e) {
        e.printStackTrace();
    }
}

protected void releaseSDK() {
    try {
        // Release the SDK context
        if (null != mOBContext) {
            mOBContext.close();
        }
    } catch (OBException e) {
        e.printStackTrace();
    }
}
```
OBContext is initialized and released in the Activity lifecycle; take HelloOrbbecActivity.java as an example:
```java
@Override
protected void onStart() {
    super.onStart();
    initSDK();
}

@Override
protected void onStop() {
    releaseSDK();
    super.onStop();
}
```
Because each sample in OrbbecSdkExamples is implemented independently, the Orbbec SDK is initialized in Activity#onStart() and released in Activity#onStop() every time. Note: the calls are placed in the subclass activities so that developers reading the code can follow the initialization and release order; in a real application, keep a single OBContext instance for the whole application.
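One way to keep a single shared instance across several activities is a reference-counted holder: the first activity to start creates the instance, the last one to stop closes it. The sketch below is hypothetical (the `ContextHolder` class is not part of the SDK) and uses a plain `AutoCloseable` standing in for `OBContext`:

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Supplier;

/**
 * Hypothetical helper: creates the shared resource on the first acquire()
 * and closes it when the last holder calls release().
 */
final class ContextHolder<T extends AutoCloseable> {
    private final Supplier<T> factory;
    private final AtomicInteger refCount = new AtomicInteger(0);
    private T instance;

    ContextHolder(Supplier<T> factory) {
        this.factory = factory;
    }

    synchronized T acquire() {
        if (refCount.getAndIncrement() == 0) {
            instance = factory.get(); // first user: create the single instance
        }
        return instance;
    }

    synchronized void release() throws Exception {
        if (refCount.decrementAndGet() == 0 && instance != null) {
            instance.close(); // last user: release the single instance
            instance = null;
        }
    }
}
```

Each Activity would call acquire() in onStart() and release() in onStop(); the underlying OBContext is then created and closed exactly once no matter how many activities overlap.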

Device management

Reading BaseActivity in OrbbecSdkExamples, note that it is an abstract class with one abstract method:
```java
protected abstract DeviceChangedCallback getDeviceChangedCallback();
```
A subclass implements the DeviceChangedCallback interface and returns the corresponding callback object from the abstract method getDeviceChangedCallback(). DeviceChangedCallback is the device-change callback interface: when the set of devices attached to the host changes, the Orbbec SDK notifies the application layer through DeviceChangedCallback. OBContext provides two constructors: one takes a custom configuration file, the other does not.
```java
public OBContext(Context context, DeviceChangedCallback callback);
public OBContext(Context context, String configPath, DeviceChangedCallback callback);
```

DeviceChangedCallback implementation

Device changes are managed in the DeviceChangedCallback:
```java
// Device discovery implementation
private DeviceChangedCallback mDeviceChangedCallback = new DeviceChangedCallback() {
    @Override
    public void onDeviceAttach(DeviceList deviceList) {
        // deviceList must be closed to release its resources
        try {
            int deviceCount = deviceList.getDeviceCount();
            for (int i = 0; i < deviceCount; ++i) {
                if (null == mDevice) {
                    // Caution: DeviceList.getDevice(index) can be called only once per index.
                    mDevice = deviceList.getDevice(i); // Remember to close mDevice when it is no longer used
                    mDeviceInfo = mDevice.getInfo(); // Remember to close mDeviceInfo when it is no longer used
                    // Use the device to do something
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            // Always release deviceList
            try {
                deviceList.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }

    @Override
    public void onDeviceDetach(DeviceList deviceList) {
        // deviceList must be closed to release its resources
        try {
            int deviceCount = deviceList.getDeviceCount();
            for (int i = 0; i < deviceCount; ++i) {
                // Get the uid of the detached device
                String uid = deviceList.getUid(i);
                // Check whether the current device was disconnected
                if (null != mDevice && mDeviceInfo.getUid().equals(uid)) {
                    // Stop streams if the application has started any
                    // Close pipelines created from mDevice, if any
                    // Release device info
                    mDeviceInfo.close();
                    mDeviceInfo = null;
                    // Release device
                    mDevice.close();
                    mDevice = null;
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            // Always release deviceList
            try {
                deviceList.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
};

// Subclasses of BaseActivity return the callback so OBContext can be initialized
protected DeviceChangedCallback getDeviceChangedCallback() {
    return mDeviceChangedCallback;
}
```

Getting a stream profile (single camera)

Many OrbbecSdkExamples samples need to obtain a camera stream profile and open the camera. Because OrbbecSdkExamples is a simple set of samples that can render only some resolution formats, unsupported formats are filtered out when querying. The stream profile is obtained through Pipeline and SensorType:
```java
/**
 * Get the best VideoStreamProfile of the pipeline supported by OrbbecSdkExamples.
 * Note: OrbbecSdkExamples is just sample code to render and save frames; it supports a limited set of VideoStreamProfile formats.
 * @param pipeline Pipeline
 * @param sensorType Target sensor type
 * @return A VideoStreamProfile on success, otherwise null.
 */
protected final VideoStreamProfile getStreamProfile(Pipeline pipeline, SensorType sensorType) {
    // Select the preferred format
    Format format;
    if (sensorType == SensorType.COLOR) {
        format = Format.RGB888;
    } else if (sensorType == SensorType.IR
            || sensorType == SensorType.IR_LEFT
            || sensorType == SensorType.IR_RIGHT) {
        format = Format.Y8;
    } else if (sensorType == SensorType.DEPTH) {
        format = Format.Y16;
    } else {
        Log.w(TAG, "getStreamProfile not support sensorType: " + sensorType);
        return null;
    }
    try {
        // Get the StreamProfileList from the sensor
        StreamProfileList profileList = pipeline.getStreamProfileList(sensorType);
        List<VideoStreamProfile> profiles = new ArrayList<>();
        for (int i = 0, N = profileList.getStreamProfileCount(); i < N; i++) {
            // Get the StreamProfile by index and convert it to a VideoStreamProfile
            VideoStreamProfile profile = profileList.getStreamProfile(i).as(StreamType.VIDEO);
            // Match the target width and format.
            // Note: width >= 640 && width <= 1280 is considered best for rendering in OrbbecSdkExamples
            if ((profile.getWidth() >= 640 && profile.getWidth() <= 1280) && profile.getFormat() == format) {
                profiles.add(profile);
            } else {
                profile.close();
            }
        }
        // If no StreamProfile matches the preferred format, try the others.
        // Note: OrbbecSdkExamples cannot render the MJPG and RVL formats
        if (profiles.isEmpty() && profileList.getStreamProfileCount() > 0) {
            for (int i = 0, N = profileList.getStreamProfileCount(); i < N; i++) {
                VideoStreamProfile profile = profileList.getStreamProfile(i).as(StreamType.VIDEO);
                if ((profile.getWidth() >= 640 && profile.getWidth() <= 1280)
                        && (profile.getFormat() != Format.MJPG && profile.getFormat() != Format.RVL)) {
                    profiles.add(profile);
                } else {
                    profile.close();
                }
            }
        }
        // Release the StreamProfileList
        profileList.close();
        // Sort the VideoStreamProfile list so the recommended profile comes first
        // Priority:
        // 1. higher fps first
        // 2. larger width first
        // 3. larger height first
        Collections.sort(profiles, new Comparator<VideoStreamProfile>() {
            @Override
            public int compare(VideoStreamProfile o1, VideoStreamProfile o2) {
                if (o1.getFps() != o2.getFps()) {
                    return o2.getFps() - o1.getFps();
                }
                if (o1.getWidth() != o2.getWidth()) {
                    return o2.getWidth() - o1.getWidth();
                }
                return o2.getHeight() - o1.getHeight();
            }
        });
        for (VideoStreamProfile p : profiles) {
            Log.d(TAG, "getStreamProfile " + p.getWidth() + "x" + p.getHeight() + "--" + p.getFps());
        }
        if (profiles.isEmpty()) {
            return null;
        }
        // The first stream profile is the best one for OrbbecSdkExamples.
        VideoStreamProfile retProfile = profiles.get(0);
        // Release the other stream profiles
        for (int i = 1; i < profiles.size(); i++) {
            profiles.get(i).close();
        }
        return retProfile;
    } catch (Exception e) {
        e.printStackTrace();
    }
    return null;
}
```

Getting D2C stream profiles

For D2C alignment, check that the color and depth profiles match; otherwise D2C will misbehave. The following shows how to obtain matching D2C profiles. For convenience, a Java bean bundles the matching color and depth profiles:
```java
/**
 * Data bean that bundles the depth and color VideoStreamProfile
 */
protected static class D2CStreamProfile implements AutoCloseable {
    // color stream profile
    private VideoStreamProfile colorProfile;
    // depth stream profile
    private VideoStreamProfile depthProfile;
    // getters and setters omitted
}
```
The following code shows how to obtain a pair of D2C profiles. Because OrbbecSdkExamples can render only a limited set of formats, the format, width, and height are also filtered during the query.
```java
/**
 * Get a D2CStreamProfile containing both color and depth profiles
 * @param pipeline Pipeline
 * @param alignMode Align mode
 * @return Success: a D2CStreamProfile containing color and depth. Failure: null.
 */
protected D2CStreamProfile genD2CStreamProfile(Pipeline pipeline, AlignMode alignMode) {
    // Configure the color profile
    VideoStreamProfile colorProfile = null;
    List<VideoStreamProfile> colorProfiles = getAvailableColorProfiles(pipeline, alignMode);
    if (colorProfiles.isEmpty()) {
        Log.w(TAG, "genConfig failed. colorProfiles is empty");
        return null;
    }
    for (VideoStreamProfile profile : colorProfiles) {
        if (profile.getWidth() >= 640 && profile.getWidth() <= 1280 && profile.getFormat() == Format.RGB888) {
            colorProfile = profile;
            break;
        }
    }
    if (null == colorProfile) {
        if (colorProfiles.size() > 0) {
            colorProfile = colorProfiles.get(0);
        } else {
            Log.w(TAG, "genConfig failed. not match color profile width >= 640 and width <= 1280");
            return null;
        }
    }
    // Release the colorProfiles resources
    for (VideoStreamProfile profile : colorProfiles) {
        if (profile != colorProfile) {
            profile.close();
        }
    }
    colorProfiles.clear();

    // Configure the depth profile
    VideoStreamProfile depthProfile = null;
    List<VideoStreamProfile> depthProfiles = getAvailableDepthProfiles(pipeline, colorProfile, alignMode);
    for (VideoStreamProfile profile : depthProfiles) {
        if (profile.getWidth() >= 640 && profile.getWidth() <= 1280 && profile.getFormat() == Format.Y16) {
            depthProfile = profile;
            break;
        }
    }
    if (null == depthProfile) {
        if (depthProfiles.size() > 0) {
            depthProfile = depthProfiles.get(0);
        } else {
            Log.w(TAG, "genConfig failed. not match depth profile width >= 640 and width <= 1280");
            colorProfile.close();
            colorProfile = null;
            return null;
        }
    }
    // Release the depthProfiles resources
    for (VideoStreamProfile profile : depthProfiles) {
        if (depthProfile != profile) {
            profile.close();
        }
    }
    depthProfiles.clear();

    D2CStreamProfile d2CStreamProfile = new D2CStreamProfile();
    d2CStreamProfile.colorProfile = colorProfile;
    d2CStreamProfile.depthProfile = depthProfile;
    return d2CStreamProfile;
}
```
Enumerate the color profiles that support D2C:
```java
/**
 * Get the available color profiles for the given AlignMode. If alignMode is ALIGN_D2C_HW_ENABLE or ALIGN_D2C_SW_ENABLE,
 * not every color stream profile has a matching depth stream profile list; this function keeps a color stream profile
 * only when it matches at least one depth stream profile under the target alignMode.
 * @param pipeline Pipeline
 * @param alignMode Align mode
 * @return Color stream profiles that have supported depth stream profiles.
 */
private List<VideoStreamProfile> getAvailableColorProfiles(Pipeline pipeline, AlignMode alignMode) {
    List<VideoStreamProfile> colorProfiles = new ArrayList<>();
    StreamProfileList depthProfileList = null;
    try (StreamProfileList colorProfileList = pipeline.getStreamProfileList(SensorType.COLOR)) {
        final int profileCount = colorProfileList.getStreamProfileCount();
        for (int i = 0; i < profileCount; i++) {
            colorProfiles.add(colorProfileList.getStreamProfile(i).as(StreamType.VIDEO));
        }
        sortVideoStreamProfiles(colorProfiles);
        // All color profiles are available when D2C is disabled
        if (alignMode == AlignMode.ALIGN_D2C_DISABLE) {
            return colorProfiles;
        }
        // Filter out color profiles that have no supported depth profile
        for (int i = colorProfiles.size() - 1; i >= 0; i--) {
            VideoStreamProfile colorProfile = colorProfiles.get(i);
            depthProfileList = pipeline.getD2CDepthProfileList(colorProfile, alignMode);
            if (null == depthProfileList || depthProfileList.getStreamProfileCount() == 0) {
                colorProfiles.remove(i);
                colorProfile.close();
            }
            // Release depthProfileList (guard against null to avoid a NullPointerException)
            if (null != depthProfileList) {
                depthProfileList.close();
                depthProfileList = null;
            }
        }
        return colorProfiles;
    } catch (OBException e) {
        e.printStackTrace();
    } finally {
        // Release depthProfileList if an OBException occurred
        if (null != depthProfileList) {
            depthProfileList.close();
            depthProfileList = null;
        }
    }
    return colorProfiles;
}
```
Check whether a color profile has supported depth profiles: if so, return the corresponding profile list; otherwise return an empty list.
```java
/**
 * Get the depth stream profile list for the target color stream profile and alignMode
 * @param pipeline Pipeline
 * @param colorProfile Target color stream profile
 * @param alignMode Target alignMode
 * @return Depth stream profiles associated with the target color stream profile.
 *         Success: the list has elements. Failure: the list is empty.
 */
private List<VideoStreamProfile> getAvailableDepthProfiles(Pipeline pipeline, VideoStreamProfile colorProfile, AlignMode alignMode) {
    List<VideoStreamProfile> depthProfiles = new ArrayList<>();
    try (StreamProfileList depthProfileList = pipeline.getD2CDepthProfileList(colorProfile, alignMode)) {
        final int profileCount = depthProfileList.getStreamProfileCount();
        for (int i = 0; i < profileCount; i++) {
            depthProfiles.add(depthProfileList.getStreamProfile(i).as(StreamType.VIDEO));
        }
        sortVideoStreamProfiles(depthProfiles);
    } catch (OBException e) {
        e.printStackTrace();
    }
    return depthProfiles;
}
```
Sorting rule for the D2C profile list: profiles with the same format are grouped together, width ascending (smallest first), height descending (largest first), and frame rate descending (highest fps first):
```java
private DeviceChangedCallback mDeviceChangedCallback = new DeviceChangedCallback() {
    @Override
    public void onDeviceAttach(DeviceList deviceList) {
        Log.d(TAG, "onDeviceAttach");
        try {
            deviceList.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
        dumpDevices();
    }

    @Override
    public void onDeviceDetach(DeviceList deviceList) {
        Log.d(TAG, "onDeviceDetach");
        try {
            deviceList.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
        dumpDevices();
    }
};

private void sortVideoStreamProfiles(List<VideoStreamProfile> profiles) {
    Collections.sort(profiles, new Comparator<VideoStreamProfile>() {
        @Override
        public int compare(VideoStreamProfile o1, VideoStreamProfile o2) {
            if (o1.getFormat() != o2.getFormat()) {
                return o1.getFormat().value() - o2.getFormat().value();
            }
            if (o1.getWidth() != o2.getWidth()) {
                // Smallest width first
                return o1.getWidth() - o2.getWidth();
            }
            if (o1.getHeight() != o2.getHeight()) {
                // Largest height first
                return o2.getHeight() - o1.getHeight();
            }
            // Highest fps first
            return o2.getFps() - o1.getFps();
        }
    });
}
```

HelloOrbbec

Function: demonstrates SDK initialization, getting the SDK version, device model, serial number, and firmware version, and releasing SDK resources.
```java
// Get device information
DeviceInfo info = mDevice.getInfo();
builder.append("Name: " + info.getName() + "\n");
builder.append("Vid: " + LocalUtils.formatHex04(info.getVid()) + "\n");
builder.append("Pid: " + LocalUtils.formatHex04(info.getPid()) + "\n");
builder.append("Uid: " + info.getUid() + "\n");
builder.append("SN: " + info.getSerialNumber() + "\n");
builder.append("connectType: " + info.getConnectionType() + "\n");
String firmwareVersion = info.getFirmwareVersion();
builder.append("FirmwareVersion: " + firmwareVersion + "\n");
builder.append(dumpExtensionInfo(info.getExtensionInfo()));
// Iterate through the sensors of the current device
for (Sensor sensor : mDevice.querySensors()) {
    // Query the sensor type
    builder.append("Sensor: " + sensor.getType() + "\n");
}
// Release device information
info.close();
```

Color example – ColorViewer

Function: demonstrates SDK initialization, device creation, Pipeline initialization and configuration, and opening a color stream with the specified configuration via Pipeline and rendering it. Build the Pipeline from mDevice and start the stream:
```java
// Check that the color sensor exists
Sensor colorSensor = mDevice.getSensor(SensorType.COLOR);
if (null == colorSensor) {
    mDevice.close();
    mDevice = null;
    return;
}
// Initialize the Pipeline through the Device
mPipeline = new Pipeline(mDevice);
// Get the stream profile; see BaseActivity#getStreamProfile(Pipeline, SensorType)
VideoStreamProfile streamProfile = getStreamProfile(mPipeline, SensorType.COLOR);
// Create the Pipeline configuration
Config config = new Config();
// Enable the color StreamProfile
if (null != streamProfile) {
    printStreamProfile(streamProfile.as(StreamType.VIDEO));
    config.enableStream(streamProfile);
    streamProfile.close();
} else {
    config.close();
    Log.w(TAG, "No target stream profile!");
    return;
}
// Start the sensor stream
mPipeline.start(config);
// Release config
config.close();
```
Read color frames from mPipeline on a worker thread and render them:
```java
private Runnable mStreamRunnable = () -> {
    ByteBuffer buffer = null;
    while (mIsStreamRunning) {
        try {
            // Wait for a frame set in blocking mode; times out after 100 ms
            FrameSet frameSet = mPipeline.waitForFrameSet(100);
            Log.d(TAG, "frameSet=" + frameSet);
            if (null == frameSet) {
                continue;
            }
            // Get the color frame
            ColorFrame colorFrame = frameSet.getColorFrame();
            if (null != buffer) {
                buffer.clear();
            }
            Log.d(TAG, "frameSet=" + frameSet + ", colorFrame=" + colorFrame);
            if (null != colorFrame) {
                Log.d(TAG, "color frame: " + colorFrame.getSystemTimeStamp());
                // Initialize the buffer
                int dataSize = colorFrame.getDataSize();
                if (null == buffer || buffer.capacity() != dataSize) {
                    buffer = ByteBuffer.allocateDirect(dataSize);
                }
                // Get the data and render it
                colorFrame.getData(buffer);
                mColorView.update(colorFrame.getWidth(), colorFrame.getHeight(), StreamType.COLOR, colorFrame.getFormat(), buffer, 1.0f);
                // Release the color frame
                colorFrame.close();
            }
            // Release the FrameSet
            frameSet.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
};
```
Stop the stream:
```java
try {
    // Stop the Pipeline and release it
    if (null != mPipeline) {
        mPipeline.stop();
        mPipeline.close();
        mPipeline = null;
    }
} catch (Exception e) {
    e.printStackTrace();
}
```
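Before stopping the pipeline, the render thread driven by mIsStreamRunning should be asked to exit and then joined, so that no waitForFrameSet call races with mPipeline.stop()/close(). The sketch below is a minimal, SDK-independent illustration of that stop-flag-plus-join pattern; the `StreamWorker` class itself is hypothetical, and `body` stands in for one waitForFrameSet/render iteration:

```java
import java.util.concurrent.atomic.AtomicBoolean;

/** Hypothetical helper showing the stop-flag + join pattern used by the samples. */
final class StreamWorker {
    private final AtomicBoolean running = new AtomicBoolean(false);
    private Thread thread;

    /** Start the worker loop; 'body' stands in for one frame-fetch/render iteration. */
    synchronized void start(Runnable body) {
        running.set(true);
        thread = new Thread(() -> {
            while (running.get()) {
                body.run();
            }
        });
        thread.start();
    }

    /** Signal the loop to exit and wait for it, so the pipeline can then be stopped safely. */
    synchronized void stop() throws InterruptedException {
        running.set(false);
        if (thread != null) {
            thread.join();
            thread = null;
        }
    }
}
```

In the sample's terms: call the worker's stop() first, then mPipeline.stop() and mPipeline.close().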

Depth example – DepthViewer

Function: demonstrates SDK initialization, device creation, Pipeline initialization and configuration, and opening a depth stream with the specified configuration via Pipeline and rendering it. Build the Pipeline from mDevice and start the stream:
```java
// Check that the depth sensor exists
Sensor depthSensor = mDevice.getSensor(SensorType.DEPTH);
if (null == depthSensor) {
    mDevice.close();
    mDevice = null;
    return;
}
// Initialize the Pipeline through the Device
mPipeline = new Pipeline(mDevice);
// Get the stream profile; see BaseActivity#getStreamProfile(Pipeline, SensorType)
VideoStreamProfile streamProfile = getStreamProfile(mPipeline, SensorType.DEPTH);
// Create the Pipeline configuration
Config config = new Config();
// Enable the depth StreamProfile
if (null != streamProfile) {
    printStreamProfile(streamProfile.as(StreamType.VIDEO));
    config.enableStream(streamProfile);
    streamProfile.close();
} else {
    config.close();
    Log.w(TAG, "No target stream profile!");
    return;
}
// Start the sensor stream
mPipeline.start(config);
// Release config
config.close();
```
Read depth frames from mPipeline on a worker thread and render them:
```java
private Runnable mStreamRunnable = () -> {
    ByteBuffer buffer = null;
    while (mIsStreamRunning) {
        try {
            // Wait for a frame set in blocking mode; times out after 100 ms
            FrameSet frameSet = mPipeline.waitForFrameSet(100);
            Log.d(TAG, "frameSet=" + frameSet);
            if (null == frameSet) {
                continue;
            }
            // Get the depth frame
            DepthFrame depthFrame = frameSet.getDepthFrame();
            if (null != buffer) {
                buffer.clear();
            }
            Log.d(TAG, "frameSet=" + frameSet + ", depthFrame=" + depthFrame);
            if (null != depthFrame) {
                Log.d(TAG, "depth frame: " + depthFrame.getSystemTimeStamp());
                // Initialize the buffer
                int dataSize = depthFrame.getDataSize();
                if (null == buffer || buffer.capacity() != dataSize) {
                    buffer = ByteBuffer.allocateDirect(dataSize);
                }
                // Get the data and render it
                depthFrame.getData(buffer);
                mDepthView.update(depthFrame.getWidth(), depthFrame.getHeight(), StreamType.DEPTH, depthFrame.getFormat(), buffer, depthFrame.getValueScale());
                // Release the depth frame
                depthFrame.close();
            }
            // Release the FrameSet
            frameSet.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
};
```
Stop the stream:
```java
try {
    // Stop the Pipeline and release it
    if (null != mPipeline) {
        mPipeline.stop();
        mPipeline.close();
        mPipeline = null;
    }
} catch (Exception e) {
    e.printStackTrace();
}
```

Infrared example – InfraredViewer

Function: demonstrates SDK initialization, device creation, Pipeline initialization and configuration, and opening an IR stream with the specified configuration via Pipeline and rendering it. Build the Pipeline from mDevice and start the stream:
```java
// Check that the IR sensor exists
Sensor irSensor = mDevice.getSensor(SensorType.IR);
if (null == irSensor) {
    mDevice.close();
    mDevice = null;
    return;
}
// Initialize the Pipeline through the Device
mPipeline = new Pipeline(mDevice);
// Get the stream profile; see BaseActivity#getStreamProfile(Pipeline, SensorType)
VideoStreamProfile streamProfile = getStreamProfile(mPipeline, SensorType.IR);
// Create the Pipeline configuration
Config config = new Config();
// Enable the IR StreamProfile
if (null != streamProfile) {
    printStreamProfile(streamProfile.as(StreamType.VIDEO));
    config.enableStream(streamProfile);
    streamProfile.close();
} else {
    config.close();
    Log.w(TAG, "No target stream profile!");
    return;
}
// Start the sensor stream
mPipeline.start(config);
// Release config
config.close();
```
Read IR frames from mPipeline on a worker thread and render them:
```java
private Runnable mStreamRunnable = () -> {
    while (mIsStreamRunning) {
        try {
            // Wait for a frame set in blocking mode; times out after 100 ms
            FrameSet frameSet = mPipeline.waitForFrameSet(100);
            if (null == frameSet) {
                continue;
            }
            // Get the IR frame
            IRFrame frame = frameSet.getIrFrame();
            if (frame != null) {
                // Get the IR data and render it
                byte[] frameData = new byte[frame.getDataSize()];
                frame.getData(frameData);
                mIrView.update(frame.getWidth(), frame.getHeight(), StreamType.IR, frame.getFormat(), frameData, 1.0f);
                // Release the IR frame
                frame.close();
            }
            // Release the FrameSet
            frameSet.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
};
```
Stop the stream:
```java
try {
    // Stop the Pipeline and release it
    if (null != mPipeline) {
        mPipeline.stop();
        mPipeline.close();
        mPipeline = null;
    }
} catch (Exception e) {
    e.printStackTrace();
}
```

Dual-IR example – DoubleIRViewer

The dual-IR example shows how to open the left and right IR streams at the same time; only some products support this. Create mPipeline, configure it, and open the left and right IR streams:
```java
Sensor irLeftSensor = mDevice.getSensor(SensorType.IR_LEFT);
if (null == irLeftSensor) {
    showToast(getString(R.string.device_not_support_ir));
    mDevice.close();
    mDevice = null;
    return;
}
Sensor irRightSensor = mDevice.getSensor(SensorType.IR_RIGHT);
if (null == irRightSensor) {
    showToast(getString(R.string.device_not_support_ir_right));
    mDevice.close();
    mDevice = null;
    return;
}
mPipeline = new Pipeline(mDevice);
// 3. Initialize the stream profiles
Config config = initStreamProfile(mPipeline);
if (null == config) {
    showToast(getString(R.string.init_stream_profile_failed));
    mPipeline.close();
    mPipeline = null;
    mDevice.close();
    mDevice = null;
    // config is null here, so there is nothing to close
    return;
}
// 4. Start the sensor streams
mPipeline.start(config);
// 5. Release config
config.close();
// 6. Create a thread to read Pipeline data
start();
```
Implementation of initStreamProfile(Pipeline):
```java
private Config initStreamProfile(Pipeline pipeline) {
    // 1. Create the Pipeline configuration
    Config config = new Config();
    SensorType[] sensorTypes = {SensorType.IR_LEFT, SensorType.IR_RIGHT};
    for (SensorType sensorType : sensorTypes) {
        // Obtain the stream profile and add it to the Config. Matching is performed
        // on width and frame rate; a profile with a width of 640 at 30 fps satisfies it.
        VideoStreamProfile irStreamProfile = getStreamProfile(pipeline, sensorType);
        // Enable the IR StreamProfile
        if (null != irStreamProfile) {
            Log.d(TAG, "irStreamProfile: " + sensorType);
            printStreamProfile(irStreamProfile.as(StreamType.VIDEO));
            config.enableStream(irStreamProfile);
            irStreamProfile.close();
        } else {
            Log.w(TAG, "No target stream profile! " + sensorType + " stream profile is null");
            config.close();
            return null;
        }
    }
    return config;
}
```
Get the left and right IR frames and render them:
```java
private Runnable mStreamRunnable = () -> {
    FrameType[] frameTypes = {FrameType.IR_LEFT, FrameType.IR_RIGHT};
    while (mIsStreamRunning) {
        try {
            // Wait for a frame set in blocking mode; times out after 100 ms
            FrameSet frameSet = mPipeline.waitForFrameSet(100);
            if (null == frameSet) {
                continue;
            }
            // Get the IR frames
            for (int i = 0; i < frameTypes.length; i++) {
                IRFrame frame = frameSet.getFrame(frameTypes[i]);
                if (frame != null) {
                    // Get the IR data and render it
                    byte[] frameData = new byte[frame.getDataSize()];
                    frame.getData(frameData);
                    OBGLView glView = frameTypes[i] == FrameType.IR_LEFT ? mIrLeftView : mIrRightView;
                    glView.update(frame.getWidth(), frame.getHeight(), StreamType.IR, frame.getFormat(), frameData, 1.0f);
                    // Release the IR frame
                    frame.close();
                }
            }
            // Release the FrameSet
            frameSet.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
};
```
Stop the stream:
```java
try {
    // Stop the Pipeline and release it
    if (null != mPipeline) {
        mPipeline.stop();
        mPipeline.close();
        mPipeline = null;
    }
} catch (Exception e) {
    e.printStackTrace();
}
```

Stream alignment example – SyncAlignViewer

Function: demonstrates stream alignment control. Because the depth or color sensor may not support mirroring, the depth and color images may end up with inconsistent mirror states, making the two images appear opposite; if this happens, use the mirroring interfaces to keep both mirror states consistent. Also, some devices may return resolutions that do not support D2C, so the D2C function is limited to the D2C resolutions the device actually supports. For example, DaBai DCW supports D2C at 640x360, but this sample may retrieve 640x480; in that case change the resolution to 640x360. In the base module BaseActivity we introduced genD2CStreamProfile(); stream alignment requires configuring both the depth and color profiles as well as the align mode, implemented as follows:
```java
private Config genD2CConfig(Pipeline pipeline, AlignMode alignMode) {
    // Get a D2CStreamProfile; see BaseActivity#genD2CStreamProfile(Pipeline, AlignMode)
    D2CStreamProfile d2CStreamProfile = genD2CStreamProfile(pipeline, alignMode);
    if (null == d2CStreamProfile) {
        return null;
    }
    // Update color information to the UI
    VideoStreamProfile colorProfile = d2CStreamProfile.getColorProfile();
    // Update depth information to the UI
    VideoStreamProfile depthProfile = d2CStreamProfile.getDepthProfile();

    Config config = new Config();
    // Set the D2C AlignMode
    config.setAlignMode(alignMode);
    config.enableStream(d2CStreamProfile.getColorProfile());
    config.enableStream(d2CStreamProfile.getDepthProfile());
    d2CStreamProfile.close();
    return config;
}
```
Build the Pipeline and start the streams:
```java
if (null == mDevice.getSensor(SensorType.DEPTH)) {
    mDevice.close();
    mDevice = null;
    showToast(getString(R.string.device_not_support_depth));
    return;
}
if (null == mDevice.getSensor(SensorType.COLOR)) {
    mDevice.close();
    mDevice = null;
    showToast(getString(R.string.device_not_support_color));
    return;
}
mPipeline = new Pipeline(mDevice);
// 3. Create the Pipeline configuration
mConfig = genD2CConfig(mPipeline, AlignMode.ALIGN_D2C_DISABLE);
if (null == mConfig) {
    mPipeline.close();
    mPipeline = null;
    mDevice.close();
    mDevice = null;
    Log.w(TAG, "onDeviceAttach: No target depth and color stream profile!");
    showToast(getString(R.string.init_stream_profile_failed));
    return;
}
// 4. Start the streams with the config
mPipeline.start(mConfig);
// 5. Create a thread to read Pipeline data
start();
```

Fetch the data and render it

```java
private Runnable mStreamRunnable = () -> {
    while (mIsStreamRunning) {
        try {
            // Wait for a frame set; times out after 100 ms
            FrameSet frameSet = mPipeline.waitForFrameSet(100);
            if (null == frameSet) {
                continue;
            }
            // Get the depth frame
            DepthFrame depthFrame = frameSet.getDepthFrame();
            // Get the color frame
            ColorFrame colorFrame = frameSet.getColorFrame();
            // Render depth overlaid on color
            depthOverlayColorProcess(depthFrame, colorFrame);
            // Release the depth frame
            if (null != depthFrame) {
                depthFrame.close();
            }
            // Release the color frame
            if (null != colorFrame) {
                colorFrame.close();
            }
            // Release the frame set
            frameSet.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
};
```
depthOverlayColorProcess() decodes and renders the color and depth data; for the full code, see SyncAlignViewerActivity.java in the OrbbecSdkExamples source. Stop the stream:
```java
try {
    // Stop the Pipeline and release it
    if (null != mPipeline) {
        mPipeline.stop();
        mPipeline.close();
        mPipeline = null;
    }
} catch (Exception e) {
    e.printStackTrace();
}
```

Hot-plug example – HotPlugin

Function: demonstrates registering device hot-plug callbacks and handling the streams obtained after attach/detach. Define the color FrameCallback:
```java
private FrameCallback mColorFrameCallback = frame -> {
    printFrameInfo(frame.as(FrameType.COLOR), mColorFps);
    // Release the frame
    frame.close();
};
```
Define the depth FrameCallback:
```java
private FrameCallback mDepthFrameCallback = frame -> {
    printFrameInfo(frame.as(FrameType.DEPTH), mDepthFps);
    // Release the frame
    frame.close();
};
```
定义IR的FrameCallback
  1. private FrameCallback mIrFrameCallback = frame -> {
  2. printFrameInfo(frame.as(FrameType.IR), mIrFps);
  3. // 释放frame资源
  4. frame.close();
  5. };
打印数据帧信息
  1. private void printFrameInfo(VideoFrame frame, int fps) {
  2. try {
  3. String frameInfo = "FrameType:" + frame.getStreamType()
  4. + ", index:" + frame.getFrameIndex()
  5. + ", width:" + frame.getWidth()
  6. + ", height:" + frame.getHeight()
  7. + ", format:" + frame.getFormat()
  8. + ", fps:" + fps
  9. + ", timeStampUs:" + frame.getTimeStampUs();
  10. if (frame.getStreamType() == FrameType.DEPTH) {
  11. frameInfo += ", middlePixelValue:" + getMiddlePixelValue(frame);
  12. }
  13. Log.i(TAG, frameInfo);
  14. } catch (Exception e) {
  15. e.printStackTrace();
  16. }
  17. }
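printFrameInfo() above calls getMiddlePixelValue(), which is not listed. A plausible sketch, assuming the depth frame carries 16-bit little-endian depth values (the common Y16 layout): read the raw buffer and return the value at the image center. This is an assumption about the helper, not the exact OrbbecSdkExamples code:

```java
public final class DepthUtils {
    /**
     * Return the depth value (in native units) of the center pixel of a
     * width x height frame whose data is 16-bit little-endian per pixel.
     */
    public static int middlePixelValue(byte[] data, int width, int height) {
        int centerIndex = (height / 2) * width + (width / 2);
        int byteIndex = centerIndex * 2; // 2 bytes per pixel
        int lo = data[byteIndex] & 0xFF;
        int hi = data[byteIndex + 1] & 0xFF;
        return (hi << 8) | lo;
    }
}
```

In the real callback the bytes would come from VideoFrame#getData() and the dimensions from getWidth()/getHeight().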
Initialize the SDK and listen for device changes
  1. // Implement the device change listener
  2. private DeviceChangedCallback mDeviceChangedCallback = new DeviceChangedCallback() {
  3. @Override
  4. public void onDeviceAttach(DeviceList deviceList) {
  5. try {
  6. if (deviceList == null || deviceList.getDeviceCount() <= 0) {
  7. setText(mNameTv, "No device connected !");
  8. return;
  9. }
  10. // 2.Create the device and get the device name
  11. mDevice = deviceList.getDevice(0);
  12. DeviceInfo devInfo = mDevice.getInfo();
  13. String deviceName = devInfo.getName();
  14. setText(mNameTv, deviceName);
  15. devInfo.close();
  16. // 3.Get the depth sensor
  17. mDepthSensor = mDevice.getSensor(SensorType.DEPTH);
  18. // 4.Start the depth stream; passing null as the profile means using the parameters from the config file.
  19. // If the device has no such profile, or no config file exists, the first profile in the profile list is used.
  20. if (null != mDepthSensor) {
  21. mDepthSensor.start(null, mDepthFrameCallback);
  22. } else {
  23. Log.w(TAG, "onDeviceAttach: depth sensor is unsupported!");
  24. }
  25. // 5.Get the color sensor
  26. mColorSensor = mDevice.getSensor(SensorType.COLOR);
  27. // 6.Start the color stream; passing null as the profile means using the parameters from the config file.
  28. // If the device has no such profile, or no config file exists, the first profile in the profile list is used.
  29. if (null != mColorSensor) {
  30. mColorSensor.start(null, mColorFrameCallback);
  31. } else {
  32. Log.w(TAG, "onDeviceAttach: color sensor is unsupported!");
  33. }
  34. // 7.Get the IR sensor
  35. mIrSensor = mDevice.getSensor(SensorType.IR);
  36. // 8.Start the IR stream; passing null as the profile means using the parameters from the config file.
  37. // If the device has no such profile, or no config file exists, the first profile in the profile list is used.
  38. if (null != mIrSensor) {
  39. mIrSensor.start(null, mIrFrameCallback);
  40. } else {
  41. Log.w(TAG, "onDeviceAttach: ir sensor is unsupported!");
  42. }
  43. } catch (Exception e) {
  44. e.printStackTrace();
  45. } finally {
  46. // 9.Update the stream profile info
  47. setText(mProfileInfoTv, formatProfileInfo());
  48. // 10.Release deviceList resources
  49. deviceList.close();
  50. }
  51. }
  52. @Override
  53. public void onDeviceDetach(DeviceList deviceList) {
  54. try {
  55. setText(mNameTv, "No device connected !");
  56. setText(mProfileInfoTv, "");
  57. mDepthFps = 0;
  58. mColorFps = 0;
  59. mIrFps = 0;
  60. // Stop the depth stream
  61. if (null != mDepthSensor) {
  62. mDepthSensor.stop();
  63. }
  64. // Stop the color stream
  65. if (null != mColorSensor) {
  66. mColorSensor.stop();
  67. }
  68. // Stop the IR stream
  69. if (null != mIrSensor) {
  70. mIrSensor.stop();
  71. }
  72. // Release the Device
  73. if (null != mDevice) {
  74. mDevice.close();
  75. mDevice = null;
  76. }
  77. // Release deviceList
  78. deviceList.close();
  79. } catch (Exception e) {
  80. e.printStackTrace();
  81. }
  82. }
  83. };
  84. // implementation BaseActivity#getDeviceChangedCallback()
  85. @Override
  86. protected DeviceChangedCallback getDeviceChangedCallback() {
  87. return mDeviceChangedCallback;
  88. }
Release resources
  1. try {
  2. // Stop the depth stream
  3. if (null != mDepthSensor) {
  4. mDepthSensor.stop();
  5. }
  6. // Stop the color stream
  7. if (null != mColorSensor) {
  8. mColorSensor.stop();
  9. }
  10. // Stop the IR stream
  11. if (null != mIrSensor) {
  12. mIrSensor.stop();
  13. }
  14. // Release the Device
  15. if (null != mDevice) {
  16. mDevice.close();
  17. }
  18. } catch (Exception e) {
  19. e.printStackTrace();
  20. }

IMU Example - IMU

Function description: This example demonstrates using the SDK to acquire IMU data and display it. Define the IMU-related data frames
  1. // Accelerometer data frame
  2. private AccelFrame mAccelFrame;
  3. // Gyroscope data frame
  4. private GyroFrame mGyroFrame;
Listen for device changes
  1. // Implement the device change listener
  2. private DeviceChangedCallback mDeviceChangedCallback = new DeviceChangedCallback() {
  3. @Override
  4. public void onDeviceAttach(DeviceList deviceList) {
  5. try {
  6. if (deviceList == null || deviceList.getDeviceCount() == 0) {
  7. showToast("Please connect a device");
  8. } else {
  9. // 2.Create the Device
  10. mDevice = deviceList.getDevice(0);
  11. // 3.Get the accelerometer Sensor from the Device
  12. mSensorAccel = mDevice.getSensor(SensorType.ACCEL);
  13. // 4.Get the gyroscope Sensor from the Device
  14. mSensorGyro = mDevice.getSensor(SensorType.GYRO);
  15. if (mSensorAccel == null || mSensorGyro == null) {
  16. showToast("This device does not support IMU");
  17. return;
  18. } else {
  19. runOnUiThread(() -> {
  20. mSurfaceViewImu.setVisibility(View.VISIBLE);
  21. });
  22. }
  23. if (mSensorAccel != null && mSensorGyro != null) {
  24. // 5.Get the accelerometer stream profile
  25. StreamProfileList accelProfileList = mSensorAccel.getStreamProfileList();
  26. if (null != accelProfileList) {
  27. mAccelStreamProfile = accelProfileList.getStreamProfile(0).as(StreamType.ACCEL);
  28. accelProfileList.close();
  29. }
  30. // 6.Get the gyroscope stream profile
  31. StreamProfileList gyroProfileList = mSensorGyro.getStreamProfileList();
  32. if (null != gyroProfileList) {
  33. mGyroStreamProfile = gyroProfileList.getStreamProfile(0).as(StreamType.GYRO);
  34. gyroProfileList.close();
  35. }
  36. }
  37. }
  38. } catch (Exception e) {
  39. e.printStackTrace();
  40. } finally {
  41. // 8.Release the device list resources
  42. deviceList.close();
  43. }
  44. }
  45. @Override
  46. public void onDeviceDetach(DeviceList deviceList) {
  47. try {
  48. showToast("Please connect a device");
  49. deviceList.close();
  50. } catch (Exception e) {
  51. e.printStackTrace();
  52. }
  53. }
  54. };
  55. // implementation BaseActivity#getDeviceChangedCallback()
  56. @Override
  57. protected DeviceChangedCallback getDeviceChangedCallback() {
  58. return mDeviceChangedCallback;
  59. }
Create the device, then get the accelerometer and gyroscope sensors from it
  1. // 2.Create the Device
  2. mDevice = deviceList.getDevice(0);
  3. // 3.Get the accelerometer Sensor from the Device
  4. mSensorAccel = mDevice.getSensor(SensorType.ACCEL);
  5. // 4.Get the gyroscope Sensor from the Device
  6. mSensorGyro = mDevice.getSensor(SensorType.GYRO);
  7. if (mSensorAccel == null || mSensorGyro == null) {
  8. showToast("This device does not support IMU");
  9. deviceList.close();
  10. return;
  11. } else {
  12. runOnUiThread(() -> {
  13. mSurfaceViewImu.setVisibility(View.VISIBLE);
  14. });
  15. }
Get the accelerometer and gyroscope stream profiles
  1. if (mSensorAccel != null && mSensorGyro != null) {
  2. // 5.Get the accelerometer stream profile
  3. StreamProfileList accelProfileList = mSensorAccel.getStreamProfileList();
  4. if (null != accelProfileList) {
  5. mAccelStreamProfile = accelProfileList.getStreamProfile(0).as(StreamType.ACCEL);
  6. accelProfileList.close();
  7. }
  8. // 6.Get the gyroscope stream profile
  9. StreamProfileList gyroProfileList = mSensorGyro.getStreamProfileList();
  10. if (null != gyroProfileList) {
  11. mGyroStreamProfile = gyroProfileList.getStreamProfile(0).as(StreamType.GYRO);
  12. gyroProfileList.close();
  13. }
  14. }
Start the streams with the specified profiles
  1. private void startIMU() {
  2. // 7.1.Initialize the IMU data refresh thread
  3. mIsRefreshIMUDataRunning = true;
  4. mRefreshIMUDataThread = new Thread(mRefreshIMUDataRunnable);
  5. mRefreshIMUDataThread.setName("RefreshIMUDataThread");
  6. // 7.2.Start the IMU data refresh thread
  7. mRefreshIMUDataThread.start();
  8. // 7.3.Start gyroscope sampling
  9. startGyroStream();
  10. // 7.4.Start accelerometer sampling
  11. startAccelStream();
  12. }
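The mRefreshIMUDataRunnable passed to the thread in step 7.1 is not listed; it is essentially a loop that redraws the latest IMU frames at a fixed interval while the running flag stays set. A minimal sketch under that assumption (the ~30 ms interval and class name are illustrative):

```java
import java.util.concurrent.atomic.AtomicBoolean;

public final class ImuRefreshLoop {
    private final AtomicBoolean running = new AtomicBoolean(false);
    private final Runnable drawImuInfo; // redraw callback, e.g. drawImuInfo()

    public ImuRefreshLoop(Runnable drawImuInfo) {
        this.drawImuInfo = drawImuInfo;
    }

    /** Body of the refresh thread: redraw until stopped. */
    public void run() {
        running.set(true);
        while (running.get()) {
            drawImuInfo.run();
            try {
                Thread.sleep(30); // refresh roughly every 30 ms
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
        }
    }

    public void stop() {
        running.set(false);
    }
}
```

stop() corresponds to setting mIsRefreshIMUDataRunning to false in stopIMU() further below.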
Start accelerometer sampling
  1. private void startAccelStream() {
  2. try {
  3. // Start accelerometer sampling
  4. if (null != mAccelStreamProfile) {
  5. mSensorAccel.start(mAccelStreamProfile, new FrameCallback() {
  6. @Override
  7. public void onFrame(Frame frame) {
  8. AccelFrame accelFrame = frame.as(FrameType.ACCEL);
  9. Log.d(TAG, "AccelFrame onFrame");
  10. synchronized (mAccelLock) {
  11. if (null != mAccelFrame) {
  12. mAccelFrame.close();
  13. mAccelFrame = null;
  14. }
  15. mAccelFrame = accelFrame;
  16. }
  17. }
  18. });
  19. mIsAccelStarted = true;
  20. }
  21. } catch (Exception e) {
  22. e.printStackTrace();
  23. }
  24. }
Start gyroscope sampling
  1. private void startGyroStream() {
  2. try {
  3. // Start gyroscope sampling
  4. if (null != mGyroStreamProfile) {
  5. mSensorGyro.start(mGyroStreamProfile, new FrameCallback() {
  6. @Override
  7. public void onFrame(Frame frame) {
  8. GyroFrame gyroFrame = frame.as(FrameType.GYRO);
  9. Log.d(TAG, "GyroFrame onFrame");
  10. synchronized (mGyroLock) {
  11. if (null != mGyroFrame) {
  12. mGyroFrame.close();
  13. mGyroFrame = null;
  14. }
  15. mGyroFrame = gyroFrame;
  16. }
  17. }
  18. });
  19. mIsGyroStarted = true;
  20. }
  21. } catch (Exception e) {
  22. e.printStackTrace();
  23. }
  24. }
Create methods to draw the AccelFrame and GyroFrame data, and draw them in real time on the data refresh thread
  1. private void drawImuInfo() {
  2. if (mIsAccelStarted || mIsGyroStarted) {
  3. synchronized (mAccelLock) {
  4. if (null != mAccelFrame) {
  5. mAccelTimeStampView.setText("AccelTimestamp:" + mAccelFrame.getTimeStamp());
  6. mAccelTemperatureView.setText(String.format("AccelTemperature:%.2f°C", mAccelFrame.getTemperature()));
  7. mAccelXView.setText(String.format("Accel.x: %.6fm/s^2", mAccelFrame.getAccelData()[0]));
  8. mAccelYView.setText(String.format("Accel.y: %.6fm/s^2", mAccelFrame.getAccelData()[1]));
  9. mAccelZView.setText(String.format("Accel.z: %.6fm/s^2", mAccelFrame.getAccelData()[2]));
  10. } else {
  11. mAccelTimeStampView.setText("AccelTimestamp: null");
  12. mAccelTemperatureView.setText("AccelTemperature: null");
  13. mAccelXView.setText("Accel.x: null");
  14. mAccelYView.setText("Accel.y: null");
  15. mAccelZView.setText("Accel.z: null");
  16. }
  17. }
  18. synchronized (mGyroLock) {
  19. if (null != mGyroFrame) {
  20. mGyroTimeStampView.setText("GyroTimestamp:" + mGyroFrame.getTimeStamp());
  21. mGyroTemperatureView.setText(String.format("GyroTemperature:%.2f°C", mGyroFrame.getTemperature()));
  22. mGyroXView.setText(String.format("Gyro.x: %.6frad/s", mGyroFrame.getGyroData()[0]));
  23. mGyroYView.setText(String.format("Gyro.y: %.6frad/s", mGyroFrame.getGyroData()[1]));
  24. mGyroZView.setText(String.format("Gyro.z: %.6frad/s", mGyroFrame.getGyroData()[2]));
  25. } else {
  26. mGyroTimeStampView.setText("GyroTimestamp: null");
  27. mGyroTemperatureView.setText("GyroTemperature: null");
  28. mGyroXView.setText("Gyro.x: null");
  29. mGyroYView.setText("Gyro.y: null");
  30. mGyroZView.setText("Gyro.z: null");
  31. }
  32. }
  33. } // if accel or gyro started
  34. }
Stop data sampling
  1. private void stopIMU() {
  2. try {
  3. // Stop accelerometer sampling
  4. if (null != mSensorAccel) {
  5. mSensorAccel.stop();
  6. }
  7. // Stop gyroscope sampling
  8. if (null != mSensorGyro) {
  9. mSensorGyro.stop();
  10. }
  11. // Stop and release the IMU data refresh thread
  12. mIsRefreshIMUDataRunning = false;
  12. mIsRefreshIMUDataRunning = false;
  13. if (null != mRefreshIMUDataThread) {
  14. try {
  15. mRefreshIMUDataThread.join(300);
  16. } catch (InterruptedException e) {
  17. e.printStackTrace();
  18. }
  19. mRefreshIMUDataThread = null;
  20. }
  21. } catch (Exception e) {
  22. e.printStackTrace();
  23. }
  24. }
Release resources
  1. try {
  2. // Release the accelerometer stream profile
  3. if (null != mAccelStreamProfile) {
  4. mAccelStreamProfile.close();
  5. mAccelStreamProfile = null;
  6. }
  7. // Release the gyroscope stream profile
  8. if (null != mGyroStreamProfile) {
  9. mGyroStreamProfile.close();
  10. mGyroStreamProfile = null;
  11. }
  12. // Release the Device
  13. if (null != mDevice) {
  13. if (null != mDevice) {
  14. mDevice.close();
  15. mDevice = null;
  16. }
  17. } catch (Exception e) {
  18. e.printStackTrace();
  19. }

Multi-Device Example - MultiDevice

Function description: This example demonstrates operating multiple devices. First, create a Context to obtain the device information list and create devices
  1. private OBContext mOBContext;
Listen for device changes
  1. // Implement the device change listener
  2. private DeviceChangedCallback mDeviceChangedCallback = new DeviceChangedCallback() {
  3. @Override
  4. public void onDeviceAttach(DeviceList deviceList) {
  5. try {
  6. int count = deviceList.getDeviceCount();
  7. for (int i = 0; i < count; i++) {
  8. // Create the device
  9. Device device = deviceList.getDevice(i);
  10. // Get the DeviceInfo
  11. DeviceInfo devInfo = device.getInfo();
  12. // Get the device name
  13. String name = devInfo.getName();
  14. // Get the device uid
  15. String uid = devInfo.getUid();
  16. // Get the device connection type
  17. String connectionType = devInfo.getConnectionType();
  18. // Release DeviceInfo resources
  19. devInfo.close();
  20. runOnUiThread(() -> {
  21. mDeviceControllerAdapter.addItem(new DeviceBean(name, uid, connectionType, device));
  22. });
  23. }
  24. } catch (Exception e) {
  25. e.printStackTrace();
  26. } finally {
  27. // Release device list resources
  28. deviceList.close();
  29. }
  30. }
  31. @Override
  32. public void onDeviceDetach(DeviceList deviceList) {
  33. try {
  34. for (DeviceBean deviceBean : mDeviceBeanList) {
  35. // Identify the detached device by uid
  36. if (deviceBean.getDeviceUid().equals(deviceList.getUid(0))) {
  37. // Release the detached device's resources
  38. deviceBean.getDevice().close();
  39. runOnUiThread(() -> {
  40. mDeviceControllerAdapter.deleteItem(deviceBean);
  41. });
  42. }
  43. }
  44. } catch (Exception e) {
  45. Log.w(TAG, "onDeviceDetach: " + e.getMessage());
  46. } finally {
  47. // Release device list resources
  48. deviceList.close();
  49. }
  50. }
  51. };
Create the device list in the device attach callback
  1. int count = deviceList.getDeviceCount();
  2. for (int i = 0; i < count; i++) {
  3. // Create the device
  4. Device device = deviceList.getDevice(i);
  5. // Get the DeviceInfo
  6. DeviceInfo devInfo = device.getInfo();
  7. // Get the device name
  8. String name = devInfo.getName();
  9. // Get the device uid
  10. String uid = devInfo.getUid();
  11. // Get the device connection type
  12. String connectionType = devInfo.getConnectionType();
  13. // Release DeviceInfo resources
  14. devInfo.close();
  15. runOnUiThread(() -> {
  16. mDeviceControllerAdapter.addItem(new DeviceBean(name, uid, connectionType, device));
  17. });
  18. }
Start streams on the selected device
  1. private void startStream(Sensor sensor, OBGLView glView) {
  2. if (null == sensor) {
  3. return;
  4. }
  5. try {
  6. // Get the sensor's stream profile list
  7. StreamProfileList profileList = sensor.getStreamProfileList();
  8. if (null == profileList) {
  9. Log.w(TAG, "start stream failed, profileList is null !");
  10. return;
  11. }
  12. switch (sensor.getType()) {
  13. case DEPTH:
  14. OBGLView depthGLView = glView;
  15. // Get the stream profile from the StreamProfileList
  16. StreamProfile depthProfile = getVideoStreamProfile(profileList, 640, 0, Format.UNKNOWN, 30);
  17. if (null != depthProfile) {
  18. printStreamProfile(depthProfile.as(StreamType.VIDEO));
  19. // Start the stream with the specified profile
  20. sensor.start(depthProfile, frame -> {
  21. DepthFrame depthFrame = frame.as(FrameType.DEPTH);
  22. byte[] bytes = new byte[depthFrame.getDataSize()];
  23. depthFrame.getData(bytes);
  24. // Render the data
  25. depthGLView.update(depthFrame.getWidth(), depthFrame.getHeight(),
  26. StreamType.DEPTH, depthFrame.getFormat(), bytes, depthFrame.getValueScale());
  27. // Release frame resources
  28. frame.close();
  29. });
  30. // Release profile resources
  31. depthProfile.close();
  32. } else {
  33. Log.w(TAG, "start depth stream failed, depthProfile is null!");
  34. }
  35. break;
  36. case COLOR:
  37. OBGLView colorGLView = glView;
  38. // Get the stream profile from the StreamProfileList
  39. StreamProfile colorProfile = getVideoStreamProfile(profileList, 640, 0, Format.RGB888, 30);
  40. if (null == colorProfile) {
  41. colorProfile = getVideoStreamProfile(profileList, 640, 0, Format.UNKNOWN, 30);
  42. }
  43. if (null != colorProfile) {
  44. printStreamProfile(colorProfile.as(StreamType.VIDEO));
  45. // Start the stream with the specified profile
  46. sensor.start(colorProfile, frame -> {
  47. ColorFrame colorFrame = frame.as(FrameType.COLOR);
  48. byte[] bytes = new byte[colorFrame.getDataSize()];
  49. // Get the frame data
  50. colorFrame.getData(bytes);
  51. // Render the data
  52. colorGLView.update(colorFrame.getWidth(), colorFrame.getHeight(), StreamType.COLOR, colorFrame.getFormat(), bytes, 1.0f);
  53. // Release frame resources
  54. frame.close();
  55. });
  56. // Release profile resources
  57. colorProfile.close();
  58. } else {
  59. Log.w(TAG, "start color stream failed, colorProfile is null!");
  60. }
  61. break;
  62. case IR:
  63. OBGLView irGLView = glView;
  64. // Get the stream profile from the StreamProfileList
  65. StreamProfile irProfile = getVideoStreamProfile(profileList, 640, 0, Format.UNKNOWN, 30);
  66. if (null != irProfile) {
  67. printStreamProfile(irProfile.as(StreamType.VIDEO));
  68. // Start the stream with the specified profile
  69. sensor.start(irProfile, frame -> {
  70. IRFrame irFrame = frame.as(FrameType.IR);
  71. byte[] bytes = new byte[irFrame.getDataSize()];
  72. // Get the frame data
  73. irFrame.getData(bytes);
  74. // Render the data
  75. irGLView.update(irFrame.getWidth(), irFrame.getHeight(),
  76. StreamType.IR, irFrame.getFormat(), bytes, 1.0f);
  77. // Release frame resources
  78. frame.close();
  79. });
  80. // Release profile resources
  81. irProfile.close();
  82. } else {
  83. Log.w(TAG, "start ir stream failed, irProfile is null!");
  84. }
  85. break;
  86. }
  87. // Release profileList resources
  88. profileList.close();
  89. } catch (Exception e) {
  90. Log.w(TAG, "startStream: " + e.getMessage());
  91. }
  92. }
Stop a specific sensor stream on the specified device
  1. private void stopStream(Sensor sensor) {
  2. if (null == sensor) {
  3. return;
  4. }
  5. try {
  6. sensor.stop();
  7. } catch (Exception e) {
  8. e.printStackTrace();
  9. }
  10. }
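The getVideoStreamProfile(profileList, 640, 0, Format.UNKNOWN, 30) calls above use an OrbbecSdkExamples helper that picks the first profile matching the requested width, height, format and frame rate, with 0 and Format.UNKNOWN acting as wildcards. The matching rule can be sketched in isolation (the Profile class below is an illustrative stand-in, not the SDK's VideoStreamProfile):

```java
public final class ProfileMatcher {
    /** Illustrative stand-in for the SDK's VideoStreamProfile. */
    public static final class Profile {
        public final int width, height, fps;
        public final String format;
        public Profile(int width, int height, String format, int fps) {
            this.width = width; this.height = height; this.format = format; this.fps = fps;
        }
    }

    /**
     * Pick the first profile matching the request; 0 for width/height/fps
     * and "UNKNOWN" for format act as wildcards. Returns null if none match.
     */
    public static Profile match(java.util.List<Profile> profiles,
                                int width, int height, String format, int fps) {
        for (Profile p : profiles) {
            if ((width == 0 || p.width == width)
                    && (height == 0 || p.height == height)
                    && ("UNKNOWN".equals(format) || p.format.equals(format))
                    && (fps == 0 || p.fps == fps)) {
                return p;
            }
        }
        return null;
    }
}
```

This mirrors why the color branch can fall back from Format.RGB888 to Format.UNKNOWN when no RGB888 profile exists.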
Release the corresponding resources and refresh the device list in the device detach callback
  1. try {
  2. for (DeviceBean deviceBean : mDeviceBeanList) {
  3. // Identify the detached device by uid
  4. if (deviceBean.getDeviceUid().equals(deviceList.getUid(0))) {
  5. // Release the detached device's resources
  6. deviceBean.getDevice().close();
  7. runOnUiThread(() -> {
  8. mDeviceControllerAdapter.deleteItem(deviceBean);
  9. });
  10. }
  11. }
  12. } catch (Exception e) {
  13. Log.w(TAG, "onDeviceDetach: " + e.getMessage());
  14. } finally {
  15. // Release device list resources
  16. deviceList.close();
  17. }
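Note that the detach snippet above only inspects deviceList.getUid(0); if several devices detach at once, the list can carry more than one uid. A more defensive sketch collects all detached uids first and removes every matching entry (DeviceBean here is a minimal stand-in for the example's own bean class):

```java
import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Set;

public final class DetachHelper {
    /** Minimal stand-in for the example's DeviceBean. */
    public static final class DeviceBean {
        public final String uid;
        public DeviceBean(String uid) { this.uid = uid; }
    }

    /**
     * Remove every bean whose uid appears in the detached uid list.
     * Returns the number of beans removed.
     */
    public static int removeDetached(List<DeviceBean> beans, List<String> detachedUids) {
        Set<String> uids = new HashSet<>(detachedUids);
        int removed = 0;
        for (Iterator<DeviceBean> it = beans.iterator(); it.hasNext(); ) {
            if (uids.contains(it.next().uid)) {
                it.remove(); // in the real code: close the Device and update the adapter
                removed++;
            }
        }
        return removed;
    }
}
```

In the activity, the removal would also close each detached Device and call mDeviceControllerAdapter.deleteItem() on the UI thread.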
Release resources
  1. try {
  2. // Release resources
  3. for (DeviceBean deviceBean : mDeviceBeanList) {
  4. try {
  5. // Release the device resources
  6. deviceBean.getDevice().close();
  7. } catch (Exception e) {
  8. Log.w(TAG, "onDestroy: " + e.getMessage());
  9. }
  10. }
  11. mDeviceBeanList.clear();
  12. // Release the SDK
  13. if (null != mOBContext) {
  14. mOBContext.close();
  15. }
  16. } catch (Exception e) {
  17. e.printStackTrace();
  18. }

Point Cloud Example - PointCloud

Function description: This example demonstrates connecting a device, starting streams, generating a depth point cloud or an RGBD point cloud, and saving it as a ply file. In the base module BaseActivity we introduced genD2CStreamProfile(); stream alignment requires configuring both the depth and color resolutions, as well as the alignment type, implemented as follows:
  1. private Config genD2CConfig(Pipeline pipeline, AlignMode alignMode) {
  2. // Get D2CStreamProfile BaseActivity#genD2CStreamProfile(Pipeline, AlignMode)
  3. D2CStreamProfile d2CStreamProfile = genD2CStreamProfile(pipeline, alignMode);
  4. if (null == d2CStreamProfile) {
  5. return null;
  6. }
  7. // Update color information to UI
  8. VideoStreamProfile colorProfile = d2CStreamProfile.getColorProfile();
  9. // Update depth information to UI
  10. VideoStreamProfile depthProfile = d2CStreamProfile.getDepthProfile();
  11. Config config = new Config();
  12. // set D2C AlignMode
  13. config.setAlignMode(alignMode);
  14. config.enableStream(d2CStreamProfile.getColorProfile());
  15. config.enableStream(d2CStreamProfile.getDepthProfile());
  16. d2CStreamProfile.close();
  17. return config;
  18. }
Listen for device changes
  1. // Implement the device change listener
  2. private DeviceChangedCallback mDeviceChangedCallback = new DeviceChangedCallback() {
  3. @Override
  4. public void onDeviceAttach(DeviceList deviceList) {
  5. try {
  6. if (null == mPipeline) {
  7. // 2.Create Device and initialize Pipeline through Device
  8. mDevice = deviceList.getDevice(0);
  9. if (null == mDevice.getSensor(SensorType.COLOR)) {
  10. // No color sensor: depth point clouds still work, so keep the device
  11. // and only disable RGBD point cloud saving
  12. showToast(getString(R.string.device_not_support_color));
  13. runOnUiThread(() -> {
  14. mSaveColorPointsBtn.setEnabled(false);
  15. });
  16. }
  17. if (null == mDevice.getSensor(SensorType.DEPTH)) {
  18. mDevice.close();
  19. mDevice = null;
  20. showToast(getString(R.string.device_not_support_depth));
  21. return;
  22. }
  23. // 3.Initialize the Pipeline through the Device
  24. mPipeline = new Pipeline(mDevice);
  25. // 4.Create Config to configure pipeline opening sensors
  26. Config config = genD2CConfig(mPipeline, AlignMode.ALIGN_D2C_HW_ENABLE);
  27. if (null == config) {
  28. mPipeline.close();
  29. mPipeline = null;
  30. mDevice.close();
  31. mDevice = null;
  32. Log.w(TAG, "onDeviceAttach: No target depth and color stream profile!");
  33. showToast(getString(R.string.init_stream_profile_failed));
  34. return;
  35. }
  36. // 5.Start sensors stream
  37. mPipeline.start(config, mPointCloudFrameSetCallback);
  38. // 6.Start the point cloud asynchronous processing thread
  39. start();
  40. // 7.Create point cloud filter
  41. mPointCloudFilter = new PointCloudFilter();
  42. // 8.Set the format of the point cloud filter
  43. mPointCloudFilter.setPointFormat(mPointFormat);
  44. // 9.Obtain camera intrinsic parameters and set parameters to point cloud filter
  45. CameraParam cameraParam = mPipeline.getCameraParam();
  46. mPointCloudFilter.setCameraParam(cameraParam);
  47. Log.i(TAG, "onDeviceAttach: cameraParam:" + cameraParam);
  48. // 10.Release config resources
  49. config.close();
  50. }
  51. } catch (Exception e) {
  52. e.printStackTrace();
  53. } finally {
  54. // 11.Release the device list resources
  55. deviceList.close();
  56. }
  57. }
  58. @Override
  59. public void onDeviceDetach(DeviceList deviceList) {
  60. try {
  61. deviceList.close();
  62. } catch (Exception e) {
  63. e.printStackTrace();
  64. }
  65. }
  66. };
Create the callback for Pipeline streaming
  1. // Pipeline streaming callback
  2. private FrameSetCallback mPointCloudFrameSetCallback = frameSet -> {
  3. if (null != frameSet) {
  4. if (mIsPointCloudRunning) {
  5. if (null == mPointFrameSet) {
  6. mPointFrameSet = frameSet;
  7. return;
  8. }
  9. }
  10. frameSet.close();
  11. }
  12. };
Point cloud filter thread processing
  1. while (mIsPointCloudRunning) {
  2. try {
  3. if (null != mPointFrameSet) {
  4. Frame frame = null;
  5. if (mPointFormat == Format.POINT) {
  6. // Set the save format to depth point cloud
  7. mPointCloudFilter.setPointFormat(Format.POINT);
  8. } else {
  9. // Set the save format to color point cloud
  10. mPointCloudFilter.setPointFormat(Format.RGB_POINT);
  11. }
  12. // When saving point clouds, mind the depth value scale
  13. DepthFrame depthFrame = mPointFrameSet.getDepthFrame();
  14. if (null != depthFrame) {
  15. // Update the point cloud's depth value scale
  16. mPointCloudFilter.setPositionDataScale(depthFrame.getValueScale());
  17. depthFrame.close();
  18. }
  19. // Process the frame set with the point cloud filter to generate point cloud data
  20. frame = mPointCloudFilter.process(mPointFrameSet);
  21. if (null != frame) {
  22. // Get the point cloud frame
  23. PointFrame pointFrame = frame.as(FrameType.POINTS);
  24. if (mIsSavePoints) {
  25. if (mPointFormat == Format.POINT) {
  26. // Get and save the depth point cloud data; its size is w * h * 3
  27. float[] depthPoints = new float[pointFrame.getDataSize() / Float.BYTES];
  28. pointFrame.getPointCloudData(depthPoints);
  29. String depthPointsPath = mSdcardDir.toString() + "/Orbbec/point.ply";
  30. FileUtils.savePointCloud(depthPointsPath, depthPoints);
  31. runOnUiThread(new Runnable() {
  32. @Override
  33. public void run() {
  34. mInfoTv.append("Save Path:" + depthPointsPath + "\n");
  35. }
  36. });
  37. } else {
  38. // Get and save the color point cloud data; its size is w * h * 6
  39. float[] colorPoints = new float[pointFrame.getDataSize() / Float.BYTES];
  40. pointFrame.getPointCloudData(colorPoints);
  41. String colorPointsPath = mSdcardDir.toString() + "/Orbbec/point_rgb.ply";
  42. FileUtils.saveRGBPointCloud(colorPointsPath, colorPoints);
  43. runOnUiThread(new Runnable() {
  44. @Override
  45. public void run() {
  46. mInfoTv.append("Save Path:" + colorPointsPath + "\n");
  47. }
  48. });
  49. }
  50. mIsSavePoints = false;
  51. }
  52. // Release the newly generated frame
  53. frame.close();
  54. }
  55. // Release the original frameSet
  56. mPointFrameSet.close();
  57. mPointFrameSet = null;
  58. }
  59. } catch (Exception e) {
  60. e.printStackTrace();
  61. }
  62. }
To save depth point cloud data, see PointCloudActivity.java in OrbbecSdkExamples
  1. public static void savePointCloud(String fileName, float[] data)
To save color point cloud data, see PointCloudActivity.java in OrbbecSdkExamples
  1. public static void saveRGBPointCloud(String fileName, float[] data)
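The two save helpers above write the point arrays out as PLY files. A minimal sketch of the depth point cloud case (w * h * 3 floats, one x/y/z triple per vertex), writing ASCII PLY: the header layout follows the PLY format specification, but this is an assumption about what the helper does, not its actual code:

```java
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Path;

public final class PlyWriter {
    /** Save x/y/z triples as an ASCII PLY file. */
    public static void savePointCloud(Path file, float[] points) throws IOException {
        int vertexCount = points.length / 3;
        try (PrintWriter out = new PrintWriter(file.toFile())) {
            out.println("ply");
            out.println("format ascii 1.0");
            out.println("element vertex " + vertexCount);
            out.println("property float x");
            out.println("property float y");
            out.println("property float z");
            out.println("end_header");
            for (int i = 0; i < vertexCount; i++) {
                out.println(points[3 * i] + " " + points[3 * i + 1] + " " + points[3 * i + 2]);
            }
        }
    }
}
```

The RGBD variant would additionally declare three color properties per vertex and write six values per line.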
Release resources
  1. try {
  2. // Stop and close the Pipeline
  3. if (null != mPipeline) {
  4. mPipeline.stop();
  5. mPipeline.close();
  6. mPipeline = null;
  7. }
  8. // Release the point cloud filter
  9. if (null != mPointCloudFilter) {
  10. try {
  11. mPointCloudFilter.close();
  12. } catch (Exception e) {
  13. }
  14. mPointCloudFilter = null;
  15. }
  16. // Release the Device
  17. if (mDevice != null) {
  18. mDevice.close();
  19. mDevice = null;
  20. }
  21. } catch (Exception e) {
  22. e.printStackTrace();
  23. }

Save To Disk Example - SaveToDisk

Function description: This example demonstrates connecting a device, starting streams, saving color frames as png files and depth frames as raw files, and converting formats with a filter. First, create a Context to obtain the device information list and create devices
  1. private OBContext mOBContext;
Listen for device changes
  1. // Implement the device change listener
  2. private DeviceChangedCallback mDeviceChangedCallback = new DeviceChangedCallback() {
  3. @Override
  4. public void onDeviceAttach(DeviceList deviceList) {
  5. try {
  6. if (null == mPipeline) {
  7. // 2.Create Device and initialize Pipeline through Device
  8. mDevice = deviceList.getDevice(0);
  9. if (null == mDevice.getSensor(SensorType.DEPTH)) {
  10. depthCount = 5;
  11. showToast(getString(R.string.device_not_support_depth));
  12. }
  13. if (null == mDevice.getSensor(SensorType.COLOR)) {
  14. colorCount = 5;
  15. showToast(getString(R.string.device_not_support_color));
  16. }
  17. mPipeline = new Pipeline(mDevice);
  18. // 3.Initialize the format conversion filter
  19. if (null == mFormatConvertFilter) {
  20. mFormatConvertFilter = new FormatConvertFilter();
  21. }
  22. // 4.Create Pipeline configuration
  23. Config config = new Config();
  24. // 5.Get the color Sensor VideoStreamProfile and configure it to Config
  25. try {
  26. VideoStreamProfile colorStreamProfile = getStreamProfile(mPipeline, SensorType.COLOR);
  27. // 6.Enable color sensor through the obtained color sensor configuration
  28. if (null != colorStreamProfile) {
  29. printStreamProfile(colorStreamProfile.as(StreamType.VIDEO));
  30. config.enableStream(colorStreamProfile);
  31. colorStreamProfile.close();
  32. } else {
  33. Log.w(TAG, "onDeviceAttach: No target color stream profile!");
  34. }
  35. } catch (Exception e) {
  36. e.printStackTrace();
  37. }
  38. // 7.Get the depth sensor configuration and configure it to Config
  39. try {
  40. VideoStreamProfile depthStreamProfile = getStreamProfile(mPipeline, SensorType.DEPTH);
  41. // 8.Enable depth sensor through the obtained depth sensor configuration
  42. if (null != depthStreamProfile) {
  43. printStreamProfile(depthStreamProfile.as(StreamType.VIDEO));
  44. config.enableStream(depthStreamProfile);
  45. depthStreamProfile.close();
  46. } else {
  47. Log.w(TAG, "onDeviceAttach: No target depth stream profile!");
  48. }
  49. } catch (Exception e) {
  50. e.printStackTrace();
  51. }
  52. initSaveImageDir();
  53. // 9.Open Pipeline with Config
  54. mPipeline.start(config);
  55. // 10.Release config resources
  56. config.close();
  57. // 11.Create a pipeline data acquisition thread and a picture saving thread
  58. start();
  59. }
  60. } catch (Exception e) {
  61. e.printStackTrace();
  62. } finally {
  63. // 12.Release DeviceList
  64. deviceList.close();
  65. }
  66. }
  67. @Override
  68. public void onDeviceDetach(DeviceList deviceList) {
  69. try {
  70. if (mDevice != null) {
  71. for (int i = 0, N = deviceList.getDeviceCount(); i < N; i++) {
  72. String uid = deviceList.getUid(i);
  73. DeviceInfo deviceInfo = mDevice.getInfo();
  74. if (null != deviceInfo && TextUtils.equals(uid, deviceInfo.getUid())) {
  75. stop();
  76. mPipeline.stop();
  77. mPipeline.close();
  78. mPipeline = null;
  79. mDevice.close();
  80. mDevice = null;
  81. }
  82. deviceInfo.close();
  83. }
  84. }
  85. } catch (Exception e) {
  86. e.printStackTrace();
  87. } finally {
  88. try {
  89. deviceList.close();
  90. } catch (Exception ignore) {
  91. }
  92. }
  93. }
  94. };
Create the Pipeline data acquisition thread and the image saving thread
  1. private void start() {
  2. colorCount = 0;
  3. depthCount = 0;
  4. mIsStreamRunning = true;
  5. mIsPicSavingRunning = true;
  6. if (null == mStreamThread) {
  7. mStreamThread = new Thread(mStreamRunnable);
  8. mStreamThread.start();
  9. }
  10. if (null == mPicSavingThread) {
  11. mPicSavingThread = new Thread(mPicSavingRunnable);
  12. mPicSavingThread.start();
  13. }
  14. }
Stream data processing
  1. int count = 0;
  2. ByteBuffer colorSrcBuf = null;
  3. ByteBuffer colorDstBuf = null;
  4. while (mIsStreamRunning) {
  5. try {
  6. // Wait up to 100 ms; if no frame set arrives, the call times out
  7. FrameSet frameSet = mPipeline.waitForFrameSet(100);
  8. if (null == frameSet) {
  9. continue;
  10. }
  11. if (count < 5) {
  12. frameSet.close();
  13. count++;
  14. continue;
  15. }
  16. // Get the color frame
  17. ColorFrame colorFrame = frameSet.getColorFrame();
  18. if (null != colorFrame) {
  19. Frame rgbFrame = null;
  20. switch (colorFrame.getFormat()) {
  21. case MJPG:
  22. mFormatConvertFilter.setFormatType(FormatConvertType.FORMAT_MJPEG_TO_RGB888);
  23. rgbFrame = mFormatConvertFilter.process(colorFrame);
  24. break;
  25. case YUYV:
  26. mFormatConvertFilter.setFormatType(FormatConvertType.FORMAT_YUYV_TO_RGB888);
  27. rgbFrame = mFormatConvertFilter.process(colorFrame);
  28. break;
  29. case UYVY:
  30. FrameCopy frameCopy = copyToFrameT(colorFrame);
  31. if (null == colorSrcBuf || colorSrcBuf.capacity() != colorFrame.getDataSize()) {
  32. colorSrcBuf = ByteBuffer.allocateDirect(colorFrame.getDataSize());
  33. }
  34. colorSrcBuf.clear();
  35. colorFrame.getData(colorSrcBuf);
  36. int colorDstSize = colorFrame.getWidth() * colorFrame.getHeight() * 3;
  37. if (null == colorDstBuf || colorDstBuf.capacity() != colorDstSize) {
  38. colorDstBuf = ByteBuffer.allocateDirect(colorDstSize);
  39. }
  40. colorDstBuf.clear();
  41. ImageUtils.uyvyToRgb(colorSrcBuf, colorDstBuf, colorFrame.getWidth(), colorFrame.getHeight());
  42. frameCopy.data = new byte[colorDstSize];
  43. colorDstBuf.get(frameCopy.data);
  44. mFrameSaveQueue.offer(frameCopy);
  45. break;
  46. default:
  47. Log.w(TAG, "Unsupported color format!");
  48. break;
  49. }
  50. if (null != rgbFrame) {
  51. FrameCopy frameCopy = copyToFrameT(rgbFrame.as(FrameType.VIDEO));
  52. mFrameSaveQueue.offer(frameCopy);
  53. rgbFrame.close();
  54. }
  55. colorFrame.close();
  56. }
  57. // Get the depth frame
  58. DepthFrame depthFrame = frameSet.getDepthFrame();
  59. if (null != depthFrame) {
  60. FrameCopy frameT = copyToFrameT(depthFrame);
  61. mFrameSaveQueue.offer(frameT);
  62. depthFrame.close();
  63. }
  64. // Release the frame set
  65. frameSet.close();
  66. } catch (Exception e) {
  67. e.printStackTrace();
  68. }
  69. }
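The UYVY branch above relies on ImageUtils.uyvyToRgb() from OrbbecSdkExamples. UYVY packs two pixels into four bytes (U0 Y0 V0 Y1), and the conversion applies a YUV-to-RGB transform per pixel; a self-contained sketch using full-range BT.601 coefficients (an assumption, not the ImageUtils implementation itself):

```java
public final class UyvyToRgb {
    private static int clamp(double v) {
        return (int) Math.max(0, Math.min(255, Math.round(v)));
    }

    /** Convert a UYVY buffer (4 bytes per 2 pixels) to packed RGB888. */
    public static byte[] convert(byte[] uyvy, int width, int height) {
        byte[] rgb = new byte[width * height * 3];
        for (int i = 0, p = 0; i < width * height; i += 2, p += 4) {
            int u = (uyvy[p] & 0xFF) - 128;     // shared chroma for both pixels
            int y0 = uyvy[p + 1] & 0xFF;
            int v = (uyvy[p + 2] & 0xFF) - 128;
            int y1 = uyvy[p + 3] & 0xFF;
            writePixel(rgb, i, y0, u, v);
            writePixel(rgb, i + 1, y1, u, v);
        }
        return rgb;
    }

    private static void writePixel(byte[] rgb, int pixelIndex, int y, int u, int v) {
        int o = pixelIndex * 3;
        rgb[o] = (byte) clamp(y + 1.402 * v);                 // R
        rgb[o + 1] = (byte) clamp(y - 0.344 * u - 0.714 * v); // G
        rgb[o + 2] = (byte) clamp(y + 1.772 * u);             // B
    }
}
```

The example's version works on direct ByteBuffers for performance; the arithmetic is the same.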
Saving data
  1. while (mIsPicSavingRunning) {
  2. try {
  3. FrameCopy frameT = mFrameSaveQueue.poll(300, TimeUnit.MILLISECONDS);
  4. if (null != frameT) {
  5. Log.d(TAG, "colorCount :" + colorCount);
  6. if (frameT.getStreamType() == FrameType.COLOR && colorCount < 5) {
  7. FileUtils.saveImage(frameT);
  8. colorCount++;
  9. }
  10. Log.d(TAG, "depthCount :" + depthCount);
  11. if (frameT.getStreamType() == FrameType.DEPTH && depthCount < 5) {
  12. FileUtils.saveImage(frameT);
  13. depthCount++;
  14. }
  15. }
  16. } catch (Exception e) {
  17. }
  18. if (colorCount == 5 && depthCount == 5) {
  19. mIsPicSavingRunning = false;
  20. break;
  21. }
  22. }
  23. mFrameSaveQueue.clear();
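The handoff between the stream thread (offer) and the saving thread (poll with timeout) above is a classic blocking-queue pattern. In isolation, with a bounded ArrayBlockingQueue so a slow writer drops frames instead of exhausting memory (the capacity of 10 is an assumption, not taken from the example):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public final class FrameSaveQueue<T> {
    // Bounded: offer() fails (drops the frame) when the saver falls behind
    private final BlockingQueue<T> queue = new ArrayBlockingQueue<>(10);

    /** Producer side: try to enqueue, dropping the frame if the queue is full. */
    public boolean offer(T frame) {
        return queue.offer(frame);
    }

    /** Consumer side: wait up to timeoutMs for the next frame, else return null. */
    public T poll(long timeoutMs) throws InterruptedException {
        return queue.poll(timeoutMs, TimeUnit.MILLISECONDS);
    }
}
```

A null return from poll() matches the saving loop above, which simply retries until both counters reach 5.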
Exit the data processing thread and the image saving thread
  1. private void stop() {
  2. mIsStreamRunning = false;
  3. mIsPicSavingRunning = false;
  4. if (null != mStreamThread) {
  5. try {
  6. mStreamThread.join(1000);
  7. } catch (InterruptedException e) {
  8. }
  9. mStreamThread = null;
  10. }
  11. if (null != mPicSavingThread) {
  12. try {
  13. mPicSavingThread.join(1000);
  14. } catch (InterruptedException e) {
  15. }
  16. mPicSavingThread = null;
  17. }
  18. }
Release resources
  1. try {
  2. // Release filter resources
  3. if (null != mFormatConvertFilter) {
  4. try {
  5. mFormatConvertFilter.close();
  6. } catch (Exception e) {
  7. e.printStackTrace();
  8. }
  9. }
  10. // Stop and close the Pipeline
  11. if (null != mPipeline) {
  12. mPipeline.stop();
  13. mPipeline.close();
  14. mPipeline = null;
  15. }
  16. // Release Device resources
  17. if (null != mDevice) {
  18. mDevice.close();
  19. mDevice = null;
  20. }
  21. } catch (Exception e) {
  22. e.printStackTrace();
  23. }
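Note the release order above: the filter closes first, the pipeline is stopped before it is closed, and the device closes last, with everything wrapped so that cleanup never throws. A sketch of that ordering with hypothetical stub classes (not the SDK types) that log their calls:

```java
import java.util.ArrayList;
import java.util.List;

public class ReleaseOrderSketch {
    static final List<String> log = new ArrayList<>();

    // Hypothetical stand-ins for the SDK's Filter, Pipeline and Device.
    static class Filter   { void close() { log.add("filter.close"); } }
    static class Pipeline { void stop()  { log.add("pipeline.stop"); }
                            void close() { log.add("pipeline.close"); } }
    static class Device   { void close() { log.add("device.close"); } }

    public static List<String> release(Filter f, Pipeline p, Device d) {
        log.clear();
        try {
            if (f != null) f.close();            // dependents first
            if (p != null) { p.stop(); p.close(); } // stop before close
            if (d != null) d.close();            // owner last
        } catch (Exception e) {
            e.printStackTrace(); // never let release() itself throw
        }
        return log;
    }
}
```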

Sensor Control Example - SensorControl

Function description: This example demonstrates device control commands and Sensor control commands. For property operations, see SensorControlActivity.java in the OrbbecSdkExamples code.
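Sensor control generally follows a read-range / clamp / write pattern: query the property's valid range from the device, clamp the requested value into it, then write it back. The sketch below models that flow with a hypothetical in-memory `DeviceStub`; the real SDK calls in SensorControlActivity.java differ in names and types.

```java
import java.util.HashMap;
import java.util.Map;

public class PropertyControlSketch {
    // Hypothetical device: holds integer properties with one shared valid range.
    static class DeviceStub {
        private final Map<String, Integer> props = new HashMap<>();
        private final int min, max;
        DeviceStub(int min, int max) { this.min = min; this.max = max; }
        int getMin() { return min; }
        int getMax() { return max; }
        void set(String key, int value) {
            if (value < min || value > max)
                throw new IllegalArgumentException("out of range: " + value);
            props.put(key, value);
        }
        int get(String key) { return props.getOrDefault(key, 0); }
    }

    // Clamp the requested value into the device's reported range before writing.
    public static int setClamped(DeviceStub dev, String key, int requested) {
        int clamped = Math.max(dev.getMin(), Math.min(dev.getMax(), requested));
        dev.set(key, clamped);
        return dev.get(key);
    }
}
```

Clamping before writing matters because many devices reject out-of-range values outright rather than clamping them internally.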

Recording and Playback Example - Recorder & Playback

Function description: Connect the device and start streaming, record the current video stream to a file, then load the video file for playback.
Listen for device changes
  1. private DeviceChangedCallback mDeviceChangedCallback = new DeviceChangedCallback() {
  2. @Override
  3. public void onDeviceAttach(DeviceList deviceList) {
  4. try {
  5. if (null == mPipeline) {
  6. // 2.Create Device and initialize Pipeline through Device
  7. mDevice = deviceList.getDevice(0);
  8. if (null == mDevice.getSensor(SensorType.DEPTH)) {
  9. showToast(getString(R.string.device_not_support_depth));
  10. return;
  11. }
  12. mPipeline = new Pipeline(mDevice);
  13. // 3.Update device information view
  14. updateDeviceInfoView(false);
  15. // 4.Create Pipeline configuration
  16. mConfig = new Config();
  17. // 5.Get the depth stream profile and configure it into Config
  18. VideoStreamProfile streamProfile = getStreamProfile(mPipeline, SensorType.DEPTH);
  19. // 6.Enable the depth stream configuration
  20. if (null != streamProfile) {
  21. printStreamProfile(streamProfile.as(StreamType.VIDEO));
  22. mConfig.enableStream(streamProfile);
  23. streamProfile.close();
  24. } else {
  25. mDevice.close();
  26. mDevice = null;
  27. mPipeline.close();
  28. mPipeline = null;
  29. mConfig.close();
  30. mConfig = null;
  31. Log.w(TAG, "onDeviceAttach: No target stream profile!");
  32. showToast(getString(R.string.init_stream_profile_failed));
  33. return;
  34. }
  35. // 7.Start pipeline
  36. mPipeline.start(mConfig);
  37. // 8.Create a thread to obtain Pipeline data
  38. start();
  39. runOnUiThread(() -> {
  40. mStartRecordBtn.setEnabled(true);
  41. });
  42. }
  43. } catch (Exception e) {
  44. e.printStackTrace();
  45. } finally {
  46. // 9.Release DeviceList
  47. deviceList.close();
  48. }
  49. }
  50. @Override
  51. public void onDeviceDetach(DeviceList deviceList) {
  52. try {
  53. release();
  54. runOnUiThread(() -> {
  55. mStartRecordBtn.setEnabled(false);
  56. mStopRecordBtn.setEnabled(false);
  57. mStartPlaybackBtn.setEnabled(isPlayFileValid());
  58. mStopPlaybackBtn.setEnabled(false);
  59. });
  60. } catch (Exception e) {
  61. e.printStackTrace();
  62. } finally {
  63. deviceList.close();
  64. }
  65. }
  66. };
Recording implementation: start and stop recording
  1. /**
  2. * Start recording
  3. */
  4. private void startRecord() {
  5. try {
  6. if (!mIsRecording) {
  7. if (null != mPipeline) {
  8. // Start recording
  9. mPipeline.startRecord(mRecordFilePath);
  10. mIsRecording = true;
  11. } else {
  12. Log.w(TAG, "mPipeline is null !");
  13. }
  14. }
  15. } catch (Exception e) {
  16. e.printStackTrace();
  17. }
  18. }
  19. /**
  20. * stop recording
  21. */
  22. private void stopRecord() {
  23. try {
  24. if (mIsRecording) {
  25. mIsRecording = false;
  26. if (null != mPipeline) {
  27. // stop recording
  28. mPipeline.stopRecord();
  29. }
  30. }
  31. } catch (Exception e) {
  32. e.printStackTrace();
  33. }
  34. }
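Both methods above guard on the `mIsRecording` flag, which makes them idempotent: calling `startRecord()` twice does not restart the file, and a stray `stopRecord()` is a no-op. A self-contained sketch of that guard, with counters standing in for the hypothetical pipeline calls:

```java
public class RecordGuardSketch {
    private boolean recording = false;
    public int startCalls = 0; // how often the underlying start actually ran
    public int stopCalls = 0;  // how often the underlying stop actually ran

    public void startRecord() {
        if (!recording) {
            startCalls++;   // stand-in for mPipeline.startRecord(path)
            recording = true;
        }
    }

    public void stopRecord() {
        if (recording) {
            recording = false;
            stopCalls++;    // stand-in for mPipeline.stopRecord()
        }
    }
}
```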
Playback implementation: start and stop playback
  1. /**
  2. * Start playback
  3. */
  4. private void startPlayback() {
  5. try {
  6. if (!FileUtils.isFileExists(mRecordFilePath)) {
  7. Toast.makeText(RecordPlaybackActivity.this, "File not found!", Toast.LENGTH_LONG).show();
  8. return;
  9. }
  10. if (!mIsPlaying) {
  11. mIsPlaying = true;
  12. // Release Playback resources
  13. if (null != mPlayback) {
  14. mPlayback.close();
  15. mPlayback = null;
  16. }
  17. // Release the previous playback pipeline
  18. if (null != mPlaybackPipe) {
  19. mPlaybackPipe.close();
  20. mPlaybackPipe = null;
  21. }
  22. // Create Playback Pipeline
  23. mPlaybackPipe = new Pipeline(mRecordFilePath);
  24. // Get the Playback from Pipeline
  25. mPlayback = mPlaybackPipe.getPlayback();
  26. // Set playback status callback
  27. mPlayback.setMediaStateCallback(mMediaStateCallback);
  28. // start playback
  29. mPlaybackPipe.start(null);
  30. // Create a playback thread
  31. if (null == mPlaybackThread) {
  32. mPlaybackThread = new Thread(mPlaybackRunnable);
  33. mPlaybackThread.start();
  34. }
  35. }
  36. } catch (Exception e) {
  37. e.printStackTrace();
  38. }
  39. }
  40. /**
  41. * stop playback
  42. */
  43. private void stopPlayback() {
  44. try {
  45. if (mIsPlaying) {
  46. mIsPlaying = false;
  47. // stop playback thread
  48. if (null != mPlaybackThread) {
  49. try {
  50. mPlaybackThread.join(300);
  51. } catch (InterruptedException e) {
  52. }
  53. mPlaybackThread = null;
  54. }
  55. // stop playback
  56. if (null != mPlaybackPipe) {
  57. mPlaybackPipe.stop();
  58. }
  59. }
  60. } catch (Exception e) {
  61. e.printStackTrace();
  62. }
  63. }
Obtain the recorded image data
  1. // playback thread
  2. private Runnable mPlaybackRunnable = () -> {
  3. while (mIsPlaying) {
  4. try {
  5. // Wait up to 100 ms for a frameset; null is returned on timeout
  6. FrameSet frameSet = mPlaybackPipe.waitForFrameSet(100);
  7. if (null == frameSet) {
  8. continue;
  9. }
  10. // Get depth stream data
  11. DepthFrame depthFrame = frameSet.getDepthFrame();
  12. if (null != depthFrame) {
  13. // Get depth data and render
  14. byte[] frameData = new byte[depthFrame.getDataSize()];
  15. depthFrame.getData(frameData);
  16. synchronized (mSync) {
  17. mDepthGLView.update(depthFrame.getWidth(), depthFrame.getHeight(), StreamType.DEPTH, depthFrame.getFormat(), frameData, depthFrame.getValueScale());
  18. }
  19. // Release the depth data frame
  20. depthFrame.close();
  21. }
  22. // Release FrameSet
  23. frameSet.close();
  24. } catch (Exception e) {
  25. e.printStackTrace();
  26. }
  27. }
  28. };
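The playback loop above treats a `null` frameset as "nothing ready yet" rather than an error: it simply polls again until the running flag drops. The sketch below models `waitForFrameSet(100)` as a 100 ms poll on a queue (a hypothetical stand-in, not the SDK call) and bounds the retries so it terminates:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class PlaybackLoopSketch {
    // Consume up to `expected` frames; give up after 5 consecutive timeouts.
    public static int consume(BlockingQueue<String> frames, int expected) throws InterruptedException {
        int rendered = 0;
        int misses = 0;
        while (rendered < expected && misses < 5) {
            String frame = frames.poll(100, TimeUnit.MILLISECONDS); // ~waitForFrameSet(100)
            if (frame == null) {
                misses++;  // timeout: no frame ready yet, keep polling
                continue;
            }
            misses = 0;
            rendered++;    // stand-in for mDepthGLView.update(...)
        }
        return rendered;
    }
}
```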
Release resources
  1. try {
  2. // Stop getting Pipeline data
  3. stop();
  4. // stop playback thread
  5. if (mIsPlaying) {
  6. mIsPlaying = false;
  7. if (null != mPlaybackThread) {
  8. try {
  9. mPlaybackThread.join(300);
  10. } catch (InterruptedException e) {
  11. }
  12. mPlaybackThread = null;
  13. }
  14. }
  15. // release playback
  16. if (null != mPlayback) {
  17. mPlayback.close();
  18. mPlayback = null;
  19. }