环信
  • Published: 2023-06-20 19:16
  • Updated: 2024-06-24 11:47
  • Views: 361

[Bug report] uni.createInnerAudioContext audio playback instance returns {"errMsg":"MediaError","errCode":-5}

Category: uni-app

Product category: uniapp/App

Development OS: Mac

Development OS version: 13.1 (22C65)

HBuilderX edition: official release

HBuilderX version: 3.8.4

Phone OS: iOS

Phone OS version: iOS 16

Phone vendor: Apple

Phone model: iPhone 14 Pro

Page type: vue

Vue version: vue3

Packaging: cloud build

Project created with: HBuilderX

App download URL or H5 URL: https://github.com/HuangFeiPeng/webim-uniapp-vue3-demo/tree/secendProject

Sample code:

Client-side sending code example:

<template>  
  <view  
    class="record_container"  
    v-if="audioState.recordStatus != RecordStatus.HIDE"  
  >  
    <view class="modal modal-record" @tap="toggleRecordModal">  
      <view class="modal-body" @tap.stop>  
        <view class="sound-waves">  
          <view  
            v-for="(item, index) in audioState.radomHeight"  
            :key="index"  
            :style="'height:' + item + 'rpx;margin-top:-' + item / 2 + 'rpx'"  
          ></view>  
          <view style="clear: both; width: 0; height: 0"></view>  
        </view>  
        <text class="desc">{{ RecordDesc[audioState.recordStatus] }}</text>  
        <view  
          class="dot"  
          @touchstart="handleRecording"  
          @touchmove="handleRecordingMove"  
          @touchend="handleRecordingCancel"  
        >  
          <image class="icon-mic" src="/static/images/send.png"></image>  
        </view>  
      </view>  
    </view>  
    <view  
      class="mask"  
      v-if="audioState.recordStatus != RecordStatus.HIDE"  
    ></view>  
  </view>  
</template>  

<script setup>  
import { reactive, onUnmounted, inject } from 'vue';  
/* EaseIM */  
import { EMClient } from '@/EaseIM';  
import { emMessages } from '@/EaseIM/imApis';  
/* inject */  
const injectTargetId = inject('targetId');  
const injectChatType = inject('chatType');  
import { RecordStatus, RecordDesc } from './record_status';  
let RunAnimation = false;  
let recordTimeInterval = null;  
let waveTimer = null;  
const InitHeight = [  
  50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50,  
  50, 50, 50,  
];  

const audioState = reactive({  
  changedTouches: null,  
  recordStatus: RecordStatus.HIDE,  
  radomHeight: InitHeight,  
  recorderManager: null,  
  recordClicked: false,  
  isLongPress: false,  
  recordTime: 0,  
  rec: null, // h5 audio record  
});  

const sysInfo = uni.getSystemInfoSync();  
if (sysInfo.uniPlatform !== 'web') {  
  audioState.recorderManager = uni.getRecorderManager();  
}  
const toggleRecordModal = () => {  
  if (audioState.recordStatus === RecordStatus.HIDE) {  
    audioState.recordStatus = RecordStatus.SHOW;  
  } else {  
    audioState.recordStatus = RecordStatus.HIDE;  
  }  
  audioState.radomHeight = InitHeight;  
};  

const handleRecordingMove = (e) => {  
  const touches = e.touches[0];  
  const changedTouches = audioState.changedTouches;  

  if (!changedTouches) {  
    return;  
  }  

  if (audioState.recordStatus == RecordStatus.SWIPE) {  
    if (changedTouches.pageY - touches.pageY < 20) {  
      audioState.recordStatus = RecordStatus.HOLD;  
    }  
  }  

  if (audioState.recordStatus == RecordStatus.HOLD) {  
    if (changedTouches.pageY - touches.pageY > 20) {  
      audioState.recordStatus = RecordStatus.SWIPE;  
    }  
  }  
};  
// Initialize the recording-start state
const initStartRecord = (e) => {  
  clearInterval(recordTimeInterval);  
  audioState.recordTime = 0;  
  audioState.changedTouches = e.touches[0];  
  audioState.recordStatus = RecordStatus.HOLD;  
  RunAnimation = true;  
  startWave();  
};  
// Track the recording duration
const saveRecordTime = () => {  
  recordTimeInterval = setInterval(() => {  
    audioState.recordTime++;  
    if (audioState.recordTime === 100) {  
      handleRecordingCancel();  
      RunAnimation = false;  
    }  
  }, 1000);  
};  
const startRecord = (e) => {  
  initStartRecord(e);  
  let recorderManager = audioState.recorderManager || uni.getRecorderManager();  
  recorderManager.onStart(() => {  
    saveRecordTime();  
  });  
  recorderManager.start({  
    format: 'mp3',  
  });  
};  
const executeRecord = (e) => {  
  if (uni.getSetting) {  
    uni.getSetting({  
      success: (res) => {  
        clearInterval(recordTimeInterval);  
        audioState.recordTime = 0;  
        let recordAuth = res.authSetting['scope.record'];  
        if (recordAuth == false) {  
          // Permission was requested before and the user denied it  
          uni.openSetting({  
            success: function (res) {  
              let recordAuth = res.authSetting['scope.record'];  
              if (recordAuth == true) {  
                uni.showToast({  
                  title: '授权成功',  
                  icon: 'success',  
                });  
              } else {  
                uni.showToast({  
                  title: '请授权录音',  
                  icon: 'none',  
                });  
              }  

              audioState.isLongPress = false;  
            },  
          });  
        } else if (recordAuth == true) {  
          // Permission already granted  
          startRecord(e);  
        } else {  
          // First run: recording permission has not been requested yet  
          uni.authorize({  
            scope: 'scope.record',  
            success: () => {  
              // Permission granted  
              uni.showToast({  
                title: '授权成功',  
                icon: 'success',  
              });  
              // Start recording now that permission is granted  
              startRecord(e);  
            },  
          });  
        }  
      },  
      fail: function () {  
        uni.showToast({  
          title: '鉴权失败,请重试',  
          icon: 'none',  
        });  
      },  
    });  
    return;  
  } else {  
    startRecord(e);  
    return;  
  }  
};  
const handleRecording = async (e) => {  
  const sysInfo = uni.getSystemInfoSync();  
  console.log('getSystemInfoSync', sysInfo);  
  if (sysInfo.app === 'alipay') {  
    // https://forum.alipay.com/mini-app/post/7301031?ant_source=opendoc_recommend  
    uni.showModal({  
      content: '支付宝小程序不支持语音消息,请查看支付宝相关api了解详情',  
    });  
    return;  
  }  
  audioState.recordClicked = true;  
  // H5 does not support uni.getRecorderManager and must be handled separately
  if (sysInfo.uniPlatform === 'web') {  
    // console.log('>>>>>>进入了web层面注册页面');  
    // #ifdef H5  
    await import('@/recorderCore/src/recorder-core');  
    await import('@/recorderCore/src/engine/mp3');  
    await import('@/recorderCore/src/engine/mp3-engine');  
    if (audioState.recordClicked == true) {  
      clearInterval(recordTimeInterval);  
      initStartRecord(e);  
      audioState.rec = new window.Recorder({  
        type: 'mp3',  
      });  
      audioState.rec.open(  
        () => {  
          saveRecordTime();  
          audioState.rec.start();  
        },  
        (msg, isUserNotAllow) => {  
          if (isUserNotAllow) {  
            uni.showToast({  
              title: '鉴权失败,请重试',  
              icon: 'none',  
            });  
          } else {  
            uni.showToast({  
              title: `开启失败,请重试`,  
              icon: 'none',  
            });  
          }  
        }  
      );  
    }  
    // #endif  
  } else {  
    setTimeout(() => {  
      if (audioState.recordClicked == true) {  
        executeRecord(e);  
      }  
    }, 350);  
  }  
};  
// Cancel or finish recording  
const handleRecordingCancel = () => {  
  RunAnimation = false;  
  let recorderManager = audioState.recorderManager; // stopping in the swiped-up state cancels sending the recording  
  if (audioState.recordStatus == RecordStatus.SWIPE) {  
    audioState.recordStatus = RecordStatus.RELEASE;  
  } else {  
    audioState.recordStatus = RecordStatus.HIDE;  
    audioState.recordClicked = false;  
  }  
  if (uni.getSystemInfoSync().uniPlatform === 'web') {  
    audioState.rec.stop(  
      function (blob) {  
        clearInterval(recordTimeInterval);  
        let duration = audioState.recordTime * 1000;  
        if (audioState.recordStatus == RecordStatus.RELEASE) {  
          console.log('user canceled');  
          audioState.recordStatus = RecordStatus.HIDE;  
          return;  
        }  
        if (duration <= 1000) {  
          uni.showToast({  
            title: '录音时间太短',  
            icon: 'none',  
          });  
        } else {  
          let blobURL = window.URL.createObjectURL(blob);  
          uploadRecord(blobURL, duration);  
        }  
        audioState.recordStatus = RecordStatus.HIDE;  
        audioState.recordTime = 0;  
      },  
      function (s) {  
        console.log('结束出错:' + s, 1);  
      },  
      true  
    );  
  } else {  
    recorderManager.onStop((res) => {  
      clearInterval(recordTimeInterval);  
      let duration = audioState.recordTime * 1000;  
      if (audioState.recordStatus == RecordStatus.RELEASE) {  
        console.log('user canceled');  
        audioState.recordStatus = RecordStatus.HIDE;  
        return;  
      }  

      if (duration <= 1000) {  
        uni.showToast({  
          title: '录音时间太短',  
          icon: 'none',  
        });  
      } else {  
        // Upload
        uploadRecord(res.tempFilePath, duration);  
      }  
      clearInterval(recordTimeInterval);  
      audioState.recordStatus = RecordStatus.HIDE;  
      audioState.recordTime = 0;  
    }); // 停止录音  

    recorderManager.stop();  
  }  
};  
// Send the voice message
const { sendDisplayMessages } = emMessages();  
const sendAudioMessage = async (res, durations) => {  
  const dataObj = JSON.parse(res.data); // parsed upload response  
  const bodys = {  
    url: dataObj.uri + '/' + dataObj.entities[0].uuid,  
    filename: `${new Date().getTime()}.mp3`,  
    filetype: 'mp3',  
    length: Math.ceil(durations / 1000),  
  };  
  const params = {  
    // Message type.  
    type: 'audio',  
    body: { ...bodys },  
    filename: bodys.filename,  
    // Recipient: the peer user ID for one-to-one chat; the group ID or chat room ID for group chat and chat room.  
    to: injectTargetId.value,  
    // Conversation type: `singleChat`, `groupChat`, or `chatRoom`.  
    chatType: injectChatType.value,  
  };  
  try {  
    const res = await sendDisplayMessages({ ...params });  
    console.log('>>>>>>已发送', res);  
  } catch (error) {  
    console.log('>>>>>语音消息发送失败', error);  
    uni.showToast({  
      title: '消息发送失败',  
      icon: 'none',  
    });  
  }  
};  
// Upload the recorded audio attachment to the EaseMob (环信) server
const uploadRecord = async (tempFilePath, durations) => {  
  if (!tempFilePath) return;  
  const apiUrl = EMClient.apiUrl;  
  const orgName = EMClient.orgName;  
  const appName = EMClient.appName;  
  const uploadTargetUrl = `${apiUrl}/${orgName}/${appName}/chatfiles`;  
  const accessToken = EMClient.token;  
  const requestParams = {  
    url: uploadTargetUrl,  
    filePath: tempFilePath,  
    fileType: 'audio',  
    name: 'file',  
    header: {  
      Authorization: 'Bearer ' + accessToken,  
    },  
    success: (res) => {  
      console.log('>>>>>录音上传成功', res);  
      uni.showToast({ title: '音源已上传...', icon: 'none' });  
      sendAudioMessage(res, durations);  
    },  
    fail: (e) => {  
      console.log('>>>>>上传失败', e);  
      uni.showToast({ title: '录音上传失败', icon: 'none' });  
    },  
  };  
  uni.uploadFile(requestParams);  
};  

// Waveform animation  
const startWave = () => {  
  var _radomHeight = [...audioState.radomHeight];  
  for (var i = 0; i < audioState.radomHeight.length; i++) {  
    // +10 keeps a bar's height from ever being 0  
    _radomHeight[i] = 100 * Math.random().toFixed(2) + 10;  
  }  
  audioState.radomHeight = _radomHeight;  
  if (RunAnimation) {  
    waveTimer = setTimeout(function () {  
      startWave();  
    }, 500);  
  } else {  
    clearTimeout(waveTimer);  
    return;  
  }  
};  
onUnmounted(() => {  
  clearInterval(recordTimeInterval);  
  clearTimeout(waveTimer);  
  audioState.recordTime = 0;  
});  

defineExpose({  
  toggleRecordModal,  
});  
</script>  
<style>  
@import './index.css';  
</style>
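The component above enforces two duration rules: `onStop` rejects any clip whose duration is at most 1000 ms as too short, and `saveRecordTime` force-stops recording once `audioState.recordTime` reaches 100 seconds. A standalone restatement of those rules (the helper names are illustrative and not part of the project):

```javascript
// Illustrative helpers restating the recording-duration rules above.
// recordTimeSeconds mirrors audioState.recordTime (counted in whole seconds).

// onStop rejects a clip when recordTime * 1000 <= 1000 ms.
function isTooShort(recordTimeSeconds) {
  return recordTimeSeconds * 1000 <= 1000;
}

// saveRecordTime stops the recording once the counter reaches 100 s.
function shouldAutoStop(recordTimeSeconds) {
  return recordTimeSeconds >= 100;
}
```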

Client-side playback example:

<template>  
  <view  
    class="audio-player"  
    @tap="playAudioMessage"  
    :style="'opacity: ' + audioState.opcity"  
  >  
    <text class="time"  
      >语音消息  
      {{  
        msg.length  
          ? msg.length + '′′'  
          : msg.body.length  
          ? msg.body.length + '′′'  
          : ''  
      }}</text  
    >  
    <view class="controls play-btn">  
      <image  
        :src="  
          isSelf(msg)  
            ? '../../../../../static/images/voicemsgmy.png'  
            : '../../../../../static/images/voicemsg.png'  
        "  
      ></image>  
    </view>  
  </view>  
</template>  

<script setup>  
import { reactive, toRefs, computed, onBeforeUnmount } from 'vue';  
/* stores */  
import { useLoginStore } from '@/stores/login';  
/* props */  
const props = defineProps({  
  msg: {  
    type: Object,  
    default: () => ({}),  
  },  
});  
const { msg } = toRefs(props);  
console.log('.>>>>>>>>传递过来的语音消息数据', msg.value);  
const loginStore = useLoginStore();  
// Check whether the message was sent by the current user
const isSelf = computed(() => {  
  return (item) => {  
    return item.from === loginStore.loginUserBaseInfos.loginUserId;  
  };  
});  
const audioState = reactive({  
  opcity: 1,  
  style: '',  
});  
let playAnimation = null;  
const innerAudioContext = uni.createInnerAudioContext({  
  obeyMuteSwitch: false,  
});  
innerAudioContext.onPlay(() => {  
  console.log('>>>>>音频播放事件触发');  
  playAnimation && clearInterval(playAnimation);  
  playAnimation = setInterval(() => {  
    let opcity = audioState.opcity;  
    audioState.opcity = opcity == 1 ? 0.4 : 1;  
  }, 500);  
});  
innerAudioContext.onEnded(() => {  
  console.log('>>>音频播放结束');  
  playAnimation && clearInterval(playAnimation);  
  audioState.opcity = 1;  
});  
innerAudioContext.onError((res) => {  
  console.log(res.errMsg);  
  console.log(res.errCode);  
  uni.showToast({ title: '播放失败', icon: 'none' });  
});  
const formatAudioToMp3 = () => {  
  uni.downloadFile({  
    url: msg.value?.url ? msg.value.url : msg.value?.body?.url,  
    header: {  
      'X-Requested-With': 'XMLHttpRequest',  
      Accept: 'audio/mp3',  
      Authorization: 'Bearer ' + msg.value.accessToken,  
    },  

    success(res) {  
      console.log(res);  
      const tempFilePath = res.tempFilePath;  
      console.log('>>>>>>音频下载完成', tempFilePath);  
      innerAudioContext.src = tempFilePath;  
      if (innerAudioContext?.src) {  
        innerAudioContext.play();  
      }  
    },  

    fail(e) {  
      console.log('downloadFile failed', e);  
      uni.showToast({  
        title: '下载失败',  
        duration: 1000,  
      });  
    },  
  });  
};  
const playAudioMessage = () => {  
  formatAudioToMp3();  
};  

onBeforeUnmount(() => {  
  // Destroy the audio playback instance when leaving the page
  innerAudioContext.destroy();  
});  
</script>  
<style>  
@import './audio.css';  
</style>  

Steps to reproduce:

User A records audio on the H5 side with recorder-core.js and uploads it.
User B receives the static resource URL of User A's audio, downloads it as mp3 via uni.downloadFile, and plays it with:

const innerAudioContext = uni.createInnerAudioContext({
  obeyMuteSwitch: false,
});
innerAudioContext.play();

This triggers onError and reproduces the issue.
For a test account, add QQ 1094521084 or WeChat WanTingIsBest and I will assist.
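The reproduction steps above can be condensed into a single function. `playRemoteAudio` is a hypothetical name, and the `uniApi` parameter stands in for the global `uni` object so the flow can be exercised outside a uni-app runtime:

```javascript
// Minimal sketch of the reproduction flow: download the remote audio,
// point an inner audio context at the temp file, and play it.
// `uniApi` is the uni-app API surface (the global `uni` in the real app).
function playRemoteAudio(uniApi, url, onError) {
  const ctx = uniApi.createInnerAudioContext({ obeyMuteSwitch: false });
  if (onError) ctx.onError(onError); // fires with errCode -5 in this report
  uniApi.downloadFile({
    url,
    success(res) {
      // On App the temp path has no file extension,
      // e.g. _doc/uniapp_temp_xxx/download/file-1687258810471
      ctx.src = res.tempFilePath;
      ctx.play();
    },
  });
  return ctx;
}
```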

Expected result:

Audio recorded on the H5 side plays normally on the iOS client.

Actual result:

Playback fails and returns {"errMsg":"MediaError","errCode":-5}.

Bug description:

Summary:
This is an IM (instant messaging) project. Audio recorded on the H5 side fails to play after download only on the uni-app iOS client; the onError callback fires with {"errMsg":"MediaError","errCode":-5}.
A similar issue was reported in the community: https://ask.dcloud.net.cn/article/39258

Verification: the recorded audio itself is fine and plays normally on all non-iOS platforms, so the problem most likely lies in the iOS-side native API handling.
Platforms tested with the same code:
Android to H5 audio message, download and play: OK
Android to Android, download and play: OK
Android and iOS to each other, download and play: OK
H5 to Android, download and play: OK
H5 to iOS, download and play: fails with the error above.
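One plausible trigger, offered as a hypothesis only: the console logs below show the downloaded temp file has no extension (`file-1687258810471`), and the iOS player may rely on the extension to pick a decoder. A hypothetical path helper (not part of the project; actually copying the file to the new path would need `uni.saveFile` or `plus.io`) that derives an `.mp3`-suffixed path before assigning `innerAudioContext.src`:

```javascript
// Hypothetical helper: append .mp3 when the downloaded temp path lacks the
// extension, so the player can infer the media type. Only the path logic is
// shown; moving the file to the new path is platform-specific work.
function withMp3Extension(tempFilePath) {
  return /\.mp3$/i.test(tempFilePath) ? tempFilePath : tempFilePath + '.mp3';
}
```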

Console output when audio sent from H5 fails to play on iOS:
19:06:20.085 >>>>>>音频下载完成, _doc/uniapp_temp_1687257412594/download/file-1687258810471 at components/emChat/messageList/type/audio/audio.vue:87
19:06:20.085 >>>>>音频播放事件触发 at components/emChat/messageList/type/audio/audio.vue:58
19:06:20.091 MediaError at components/emChat/messageList/type/audio/audio.vue:71
19:06:20.096 [Number] -5 at components/emChat/messageList/type/audio/audio.vue:72

Console output when audio sent from other platforms plays on iOS:
19:06:20.085 >>>>>>音频下载完成, _doc/uniapp_temp_1687257412594/download/file-1687258810471 at components/emChat/messageList/type/audio/audio.vue:87
19:06:20.085 >>>>>音频播放事件触发 at components/emChat/messageList/type/audio/audio.vue:58
19:06:20.091 MediaError at components/emChat/messageList/type/audio/audio.vue:71
19:06:20.096 [Number] -5 at components/emChat/messageList/type/audio/audio.vue:72

2023-06-20 19:16 Assignee: DCloud_iOS_WZT

环信 (author) - EaseMob official account

The same problem was also reported back in 2021: https://ask.dcloud.net.cn/article/39258

1***@qq.com - 啦啦啦啦

Is there a solution yet? iOS still returns -5 when playing network audio.

要回复问题请先登录注册