  • How to start recording when a Start button is clicked, move a slider while a text box next to it shows the elapsed recording time, and stop and save the recording when a Stop button is clicked
  • Implementing recording in C++ with waveInOpen

    Viewed 10,000+ times, widely upvoted 2016-12-13 16:42:10

    In this first post of the week, I want to share something I just learned: how to record audio on Windows. I need it because I am writing a network-phone program, which obviously requires both audio recording and playback as basic features. Let's start with recording:

    Windows provides API functions for recording (the waveIn family of functions; playback uses the waveOut family).

    When using these functions, be sure to include the corresponding headers and link against the multimedia library:

    #include <windows.h>
    #include <mmsystem.h>
    #pragma comment(lib, "winmm.lib")

    1) Before recording, first describe the audio stream by filling in a WAVEFORMATEX structure. Here is the definition from MSDN:

    typedef struct
    {
    WORD wFormatTag;// waveform audio format; usually WAVE_FORMAT_PCM
    WORD nChannels;// number of channels, 1 or 2 (most sound cards today are stereo, so typically 2)
    DWORD nSamplesPerSec;// sample rate per channel (commonly 8 kHz, 11.025 kHz, 22.05 kHz, or 44.1 kHz)
    DWORD nAvgBytesPerSec;// average data rate, in bytes per second
    WORD nBlockAlign;// block alignment in bytes, generally (nChannels*wBitsPerSample)/8
    WORD wBitsPerSample;// bits per sample; for WAVE_FORMAT_PCM this is 8 or 16
    WORD cbSize;// size of any extra format information; 0 for PCM
    } WAVEFORMATEX, *PWAVEFORMATEX;

    Define a WAVEFORMATEX object and fill it in according to your needs, for example:

    BOOL CIP_PHONEDlg::initAudioDevice()
    {
    	try
    	{
    		waveForm.nSamplesPerSec = 44100; /* sample rate */
    		waveForm.wBitsPerSample = 16; /* sample size */
    		waveForm.nChannels= 2; /* channels*/
    		waveForm.cbSize = 0; /* size of _extra_ info */
    		waveForm.wFormatTag = WAVE_FORMAT_PCM;
    		waveForm.nBlockAlign = (waveForm.wBitsPerSample * waveForm.nChannels) >> 3;
    		waveForm.nAvgBytesPerSec = waveForm.nBlockAlign * waveForm.nSamplesPerSec;
    		return TRUE;
    	}
    	catch(...)
    	{
    		return FALSE;
    	}
    }
    2) Once the stream format is set, open the recording device with the waveInOpen function.

    This function needs a device handle, through which all subsequent recording is done. Declare one: HWAVEIN hWaveIn;// input device

    MMRESULT mRet=waveInOpen(&hWaveIn, WAVE_MAPPER, &waveForm,(DWORD_PTR)MicCallBack, (DWORD_PTR)this, CALLBACK_FUNCTION);// DWORD_PTR, not DWORD, so pointers survive on 64-bit builds
    if (mRet != MMSYSERR_NOERROR)
    {
    	return FALSE;
    }
    The first parameter of waveInOpen is a pointer that, on success, receives the handle of the opened device.

    The second is the ID of the device to open. Normally you do not pick a device manually; you pass WAVE_MAPPER and let the system find a usable device.

    The third is a pointer to the stream-format object we filled in during step 1.

    The fourth identifies the recording message handler: a callback function, an event handle, a window handle, or a thread. Recording messages (device opened, buffer full, device closed) are delivered to whatever this parameter names.

    The fifth is user data passed through to that handler (here, the `this` pointer).

    The sixth is a flag describing the fourth parameter: CALLBACK_FUNCTION if it is a function; CALLBACK_EVENT if it is an event; CALLBACK_WINDOW if it is a window handle (in which case the fifth parameter is 0); CALLBACK_THREAD if it is a thread; or CALLBACK_NULL if it is 0.

    In short, it specifies who handles the messages that recording produces.

    Note: for this call to succeed, a recording device must already exist (on a desktop PC, a microphone must be plugged in before it can be detected).

    3) With the device open, declare two buffers to hold the recorded audio plus two WAVEHDR header structures, and initialize the headers with the buffers:

    	pBuffer1 = new BYTE[bufsize];
    	if (pBuffer1 == NULL)
    	{
    		return FALSE;
    	}
    	memset(pBuffer1,0,bufsize);
    	Whdr1.lpData = (LPSTR)pBuffer1;
    	Whdr1.dwBufferLength = bufsize;
    	Whdr1.dwBytesRecorded = 0;
    	Whdr1.dwUser = 0;
    	Whdr1.dwFlags = 0;
    	Whdr1.dwLoops = 1;
    
    	pBuffer2 = new BYTE[bufsize];
    	if (pBuffer2 == NULL)
    	{
    		return FALSE;
    	}
    	memset(pBuffer2,0,bufsize);
    	Whdr2.lpData = (LPSTR)pBuffer2;
    	Whdr2.dwBufferLength = bufsize;
    	Whdr2.dwBytesRecorded = 0;
    	Whdr2.dwUser = 0;
    	Whdr2.dwFlags = 0;
    	Whdr2.dwLoops = 1;

    The WAVEHDR structure is defined as follows:

    typedef struct { 
        LPSTR      lpData;// pointer to the buffer
        DWORD      dwBufferLength; // buffer size in bytes
        DWORD      dwBytesRecorded; // number of bytes actually recorded into the buffer
        DWORD_PTR  dwUser; // user data
        DWORD      dwFlags; // status flags set by the system
        DWORD      dwLoops; // loop count (used for playback, not recording)
        struct wavehdr_tag * lpNext;// reserved
        DWORD_PTR reserved; // reserved
    } WAVEHDR;
    4) Next, prepare the two headers for recording with the waveInPrepareHeader function:

    waveInPrepareHeader(hWaveIn, &Whdr1, sizeof(WAVEHDR));// prepare the first data-block header for recording
    waveInPrepareHeader(hWaveIn, &Whdr2, sizeof(WAVEHDR));// prepare the second data-block header for recording
    waveInPrepareHeader takes the recording device handle, the buffer header object, and the size of the header structure.

    5) Once prepared, add the buffers to the device with the waveInAddBuffer function:

    waveInAddBuffer(hWaveIn, &Whdr1, sizeof (WAVEHDR));// queue the data block as a recording input buffer
    waveInAddBuffer(hWaveIn, &Whdr2, sizeof (WAVEHDR));// queue the data block as a recording input buffer
    This queues buffers 1 and 2 as recording buffers. They join the device's buffer queue and are filled in rotation.

    6) Start recording with the waveInStart function:

    waveInStart(hWaveIn);// start recording

    This tells the hWaveIn device to start writing waveform audio into the recording buffers queued above.

    7) When a buffer fills, the system automatically notifies the function/window/event specified in waveInOpen. In that handler you can send the buffer's contents to another user or save them to a file; either way, you copy the buffer out. The sound card automatically removes a full buffer from the queue, so after copying, reinitialize the buffer and its header and re-queue it with waveInAddBuffer to keep recording.

    DWORD CIP_PHONEDlg::MicCallBack(HWAVEIN hWaveIn,UINT uMsg,DWORD dwInstance,DWORD dwParam1,DWORD dwParam2)
    {// All of these recording messages are raised automatically by the driver; the developer never triggers them manually
    	CIP_PHONEDlg* pwnd=(CIP_PHONEDlg*)dwInstance;// the dialog doing the recording
    	PWAVEHDR whd=(PWAVEHDR)dwParam1; // header of the buffer that raised the message
    	switch(uMsg)
    	{
    	case WIM_OPEN:// device opened; nothing to do here
    		break;
    	case WIM_DATA:// a data buffer is full; append it to a pcm file
    		// Save the data; write in binary mode, otherwise the recording will contain noise
    		pwnd->pf=fopen( pwnd->soundName, "ab+");
    		fwrite(whd->lpData, 1, whd->dwBytesRecorded, pwnd->pf);// write only the bytes actually recorded
    		if (pwnd->isGetSound)
    		{
    			waveInAddBuffer(hWaveIn,whd,sizeof(WAVEHDR));// re-queue the buffer to keep recording
    		}
    		fclose(pwnd->pf);
    		break;
    	case WIM_CLOSE:// stop recording
    		waveInStop(hWaveIn);
    		waveInReset(hWaveIn);	
    		waveInClose(hWaveIn);
    		break;
    	default:
    		break;
    	}
    	return 0;
    }
    8) Stop recording; use waveInClose to perform this operation:

    waveInStop(hWaveIn); waveInReset(hWaveIn);// stop capture and return all pending buffers
    waveInUnprepareHeader(hWaveIn, &Whdr1, sizeof(WAVEHDR));// un-prepare both headers
    waveInUnprepareHeader(hWaveIn, &Whdr2, sizeof(WAVEHDR));
    delete []pBuffer1; delete []pBuffer2;// free the buffers only after the device has released them
    waveInClose(hWaveIn);// close the device
    Closing the device triggers the WIM_CLOSE message.

    During this shutdown sequence, waveInStop runs first: it stops waveform data from being written into the input buffers;

    then waveInReset: it stops input, resets the current position to zero, marks all pending input buffers as done, and returns them to the application (essentially a reset operation);

    finally waveInClose: it closes the recording device.

    After these 8 steps you can record a piece of audio normally.

    Full code:

    Class definition:

    #include <mmsystem.h>
    #pragma comment(lib, "winmm.lib")
    
    class CIP_PHONEDlg : public CDialog
    {
    private:// audio capture
    	BOOL initAudioDevice();// initialize the audio format structure
    	BOOL getSound();// capture audio
    	static DWORD CALLBACK MicCallBack(HWAVEIN hWaveIn,UINT uMsg,DWORD dwInstance,DWORD dwParam1,DWORD dwParam2);// message handler for audio capture
    	void getTimeStr();// build the recording file name
    public:// members used while capturing
    	HWAVEIN hWaveIn;// input device
    	WAVEFORMATEX waveForm;// capture format structure
    	BYTE *pBuffer1,*pBuffer2;// capture buffers
    	WAVEHDR Whdr1,Whdr2;// data-buffer header structures
    	bool isGetSound;// whether capture is active
    	FILE *pf;// audio file handle
    	CString soundName;// recording file name
    };
    The implementation:
    BOOL CIP_PHONEDlg::initAudioDevice()
    {
    	try
    	{
    		waveForm.nSamplesPerSec = 44100; /* sample rate */
    		waveForm.wBitsPerSample = 16; /* sample size */
    		waveForm.nChannels= 2; /* channels*/
    		waveForm.cbSize = 0; /* size of _extra_ info */
    		waveForm.wFormatTag = WAVE_FORMAT_PCM;
    		waveForm.nBlockAlign = (waveForm.wBitsPerSample * waveForm.nChannels) >> 3;
    		waveForm.nAvgBytesPerSec = waveForm.nBlockAlign * waveForm.nSamplesPerSec;
    		return TRUE;
    	}
    	catch(...)
    	{
    		return FALSE;
    	}
    }
    void CIP_PHONEDlg::getTimeStr()
    {// build the recording file name
    	try
    	{
    		SYSTEMTIME times;
    		::GetSystemTime(&times);
    		CString sTimeStr="";
    		sTimeStr.Format("%d_%d_%d_%d_%d_%d_%d",times.wYear,times.wMonth,times.wDay,times.wHour,times.wMinute,times.wSecond,times.wMilliseconds);
    		this->soundName="./RecordSounds\\"+sTimeStr+".pcm";// path where the audio is stored
    	}
    	catch(...)
    	{
    		this->soundName="./RecordSounds\\error_file.pcm";// fallback name on error
    	}
    	
    	return ;
    }
    BOOL CIP_PHONEDlg::getSound()
    {
    	if (!initAudioDevice())
    	{
    		return FALSE;
    	}
    	MMRESULT mRet=waveInOpen(&hWaveIn, WAVE_MAPPER, &waveForm,(DWORD_PTR)MicCallBack, (DWORD_PTR)this, CALLBACK_FUNCTION);
    	if (mRet != MMSYSERR_NOERROR)
    	{
    		return FALSE;
    	}
    	getTimeStr();// generate the recording file name
    	pBuffer1 = new BYTE[bufsize];
    	if (pBuffer1 == NULL)
    	{
    		return FALSE;
    	}
    	memset(pBuffer1,0,bufsize);
    	Whdr1.lpData = (LPSTR)pBuffer1;
    	Whdr1.dwBufferLength = bufsize;
    	Whdr1.dwBytesRecorded = 0;
    	Whdr1.dwUser = 0;
    	Whdr1.dwFlags = 0;
    	Whdr1.dwLoops = 1;
    
    	pBuffer2 = new BYTE[bufsize];
    	if (pBuffer2 == NULL)
    	{
    		return FALSE;
    	}
    	memset(pBuffer2,0,bufsize);
    	Whdr2.lpData = (LPSTR)pBuffer2;
    	Whdr2.dwBufferLength = bufsize;
    	Whdr2.dwBytesRecorded = 0;
    	Whdr2.dwUser = 0;
    	Whdr2.dwFlags = 0;
    	Whdr2.dwLoops = 1;
    	waveInPrepareHeader(hWaveIn, &Whdr1, sizeof(WAVEHDR));// prepare the first data-block header for recording
    	waveInPrepareHeader(hWaveIn, &Whdr2, sizeof(WAVEHDR));// prepare the second data-block header for recording
    	waveInAddBuffer(hWaveIn, &Whdr1, sizeof (WAVEHDR));// queue the data block as a recording input buffer
    	waveInAddBuffer(hWaveIn, &Whdr2, sizeof (WAVEHDR));// queue the data block as a recording input buffer
    	waveInStart(hWaveIn);// start recording
    
    	// Note: do not free the buffers or close the device right after starting,
    	// or recording stops immediately. Cleanup belongs in the stop path (step 8),
    	// after waveInStop/waveInReset have returned the buffers.
    	return TRUE;
    }
    
    DWORD CIP_PHONEDlg::MicCallBack(HWAVEIN hWaveIn,UINT uMsg,DWORD dwInstance,DWORD dwParam1,DWORD dwParam2)
    {
    	CIP_PHONEDlg* pwnd=(CIP_PHONEDlg*)dwInstance;
    	PWAVEHDR whd=(PWAVEHDR)dwParam1; 
    	switch(uMsg)
    	{
    	case WIM_OPEN:
    		break;
    	case WIM_DATA:
    		// Save the data; write in binary mode, otherwise the recording will contain noise
    		pwnd->pf=fopen( pwnd->soundName, "ab+");
    		fwrite(whd->lpData, 1, whd->dwBytesRecorded, pwnd->pf);// write only the bytes actually recorded
    		if (pwnd->isGetSound)
    		{
    			waveInAddBuffer(hWaveIn,whd,sizeof(WAVEHDR));
    		}
    		fclose(pwnd->pf);
    		break;
    	case WIM_CLOSE:
    		waveInStop(hWaveIn);
    		waveInReset(hWaveIn);	
    		waveInClose(hWaveIn);
    		break;
    	default:
    		break;
    	}
    	return 0;
    }

     

    Note: when recording, you may need to adjust your system settings so the microphone is allowed to record.

    That's it for recording; the details of playing audio back will be covered in the next post.





      Converting recordings to text is common in daily life and office work. Usually we record with a voice recorder, and a traditional recorder does only one thing: record. So how do you turn a finished recording into text? Many people find this a headache; here is one method, so let's take a look!

      Tool used: Xunjie text-to-speech software (迅捷文字转语音软件)

      Steps:

      Step 1: Open the text-to-speech tool on your computer.

      Step 2: Three options appear on the page that opens; choose "recording to text".

      Step 3: In the toolbar on the right, upload the recording files you want to convert to the designated area by clicking either "select file" or "select folder"; either works.

      Step 4: To add more files, click "add file" or "add folder" below.

      Step 5: If something goes wrong during upload and a file needs to be removed, click "clear text" below or the X after the file.

      Step 6: In the conversion settings, choose the language to recognize and the save path as needed, then click OK.

      Step 7: Once the above is done, click "start conversion". When the list shows the conversion succeeded, click "open folder" or go to your chosen save path to view the result.

      That is a simple way to convert recordings to text, explained step by step. If you run into problems during conversion, leave a comment, and if you find it useful, share it with your friends!

    Reposted from: https://my.oschina.net/u/4000685/blog/3071120


    Scenario

    In Android, how to implement press-to-record (as when sending a voice message), show the recordings in a RecyclerView, and play a recording when its item is tapped.

     

    Note:

    Blog:
    https://blog.csdn.net/badao_liumang_qizhi
    Follow the public account
    霸道的程序猿
    for programming e-books, tutorial pushes, and free downloads.

    Implementation

    Build the page layout

    Create a new project and implement the page layout as follows

     

    Layout code:

    <?xml version="1.0" encoding="utf-8"?>
    <androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        xmlns:app="http://schemas.android.com/apk/res-auto">
    
    
        <androidx.recyclerview.widget.RecyclerView
            android:id="@+id/recycler"
            android:layout_width="match_parent"
            android:layout_height="wrap_content"
            app:layout_constraintBottom_toTopOf="@+id/layout_bottom"
            app:layout_constraintTop_toTopOf="parent" />
    
        <RelativeLayout
            android:layout_width="match_parent"
            android:layout_height="101dp"
            android:id="@+id/layout_bottom"
            android:background="#F7F7F7"
            app:layout_constraintBottom_toBottomOf="parent"
            app:layout_constraintStart_toStartOf="parent">
    
            <ImageView
                android:layout_width="66dp"
                android:layout_height="66dp"
                android:id="@+id/img_voice"
                android:background="@mipmap/badao"
                android:layout_centerInParent="true"
                />
    
        </RelativeLayout>
    
    </androidx.constraintlayout.widget.ConstraintLayout>

    Then obtain these controls in MainActivity and set press and release handlers on the ImageView at the bottom

    public class MainActivity extends AppCompatActivity {
    
        private ImageView audioImageView;
        private RecyclerView mRecyclerView;
    
    
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_main);
    
            audioImageView = findViewById(R.id.img_voice);
            audioImageView.setOnTouchListener(new View.OnTouchListener() {
                @Override
                public boolean onTouch(View view, MotionEvent motionEvent) {
                    if(motionEvent.getAction() == MotionEvent.ACTION_DOWN)
                    {
                        Toast.makeText(MainActivity.this,"Recording started",Toast.LENGTH_SHORT).show();
    
                        return true;
                    }else if(motionEvent.getAction() == MotionEvent.ACTION_UP)
                    {
                        Toast.makeText(MainActivity.this,"Recording stopped",Toast.LENGTH_SHORT).show();
    
                        return true;
                    }
                    return false;
                }
            });
    
        }
    }

    Grant recording permissions

    Open AndroidManifest.xml and add

        <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
        <uses-permission android:name="android.permission.RECORD_AUDIO"/>

    Encapsulate the recording helper classes

    Create an audioHelper package under your package, and add the following interfaces and classes to it

     

    AudioRecordManager

    package com.badao.audiodemo.audioHelper;
    
    import android.annotation.TargetApi;
    import android.content.Context;
    import android.content.res.Resources;
    import android.media.AudioManager;
    import android.media.MediaRecorder;
    import android.net.Uri;
    import android.os.Build;
    import android.os.Handler;
    import android.os.Message;
    import android.os.SystemClock;
    import android.telephony.PhoneStateListener;
    import android.telephony.TelephonyManager;
    import android.text.TextUtils;
    import android.util.Log;
    
    import java.io.File;
    
    public class AudioRecordManager implements Handler.Callback {
        private static final String TAG = "LQR_AudioRecordManager";
        private int RECORD_INTERVAL;
        private String SAVE_PATH;
        private IAudioState mCurAudioState;
        private Context mContext;
        private Handler mHandler;
        private AudioManager mAudioManager;
        private MediaRecorder mMediaRecorder;
        private Uri mAudioPath;
        private long smStartRecTime;
        private AudioManager.OnAudioFocusChangeListener mAfChangeListener;
        IAudioState idleState;
        IAudioState recordState;
        IAudioState sendingState;
        IAudioState cancelState;
        IAudioState timerState;
        private IAudioRecordListener mAudioRecordListener;
        public static AudioRecordManager mInstance;
    
        public static AudioRecordManager getInstance(Context context) {
            if (mInstance == null) {
                Class var1 = AudioRecordManager.class;
                synchronized(AudioRecordManager.class) {
                    if (mInstance == null) {
                        mInstance = new AudioRecordManager(context);
                    }
                }
            }
    
            return mInstance;
        }
    
        @TargetApi(21)
        private AudioRecordManager(Context context) {
            this.mContext = context;
            this.mHandler = new Handler(this);
            this.RECORD_INTERVAL = 60;
            this.idleState = new AudioRecordManager.IdleState();
            this.recordState = new AudioRecordManager.RecordState();
            this.sendingState = new AudioRecordManager.SendingState();
            this.cancelState = new AudioRecordManager.CancelState();
            this.timerState = new AudioRecordManager.TimerState();
            if (Build.VERSION.SDK_INT < 21) {
                try {
                    TelephonyManager e = (TelephonyManager)this.mContext.getSystemService("phone");
                    e.listen(new PhoneStateListener() {
                        public void onCallStateChanged(int state, String incomingNumber) {
                            switch(state) {
                                case 1:
                                    AudioRecordManager.this.sendEmptyMessage(6);
                                case 0:
                                case 2:
                                default:
                                    super.onCallStateChanged(state, incomingNumber);
                            }
                        }
                    }, 32);
                } catch (SecurityException var3) {
                    var3.printStackTrace();
                }
            }
    
            this.mCurAudioState = this.idleState;
            this.idleState.enter();
        }
    
        public final boolean handleMessage(Message msg) {
            Log.i("LQR_AudioRecordManager", "handleMessage " + msg.what);
            AudioStateMessage m;
            switch(msg.what) {
                case 2:
                    this.sendEmptyMessage(2);
                    break;
                case 7:
                    m = AudioStateMessage.obtain();
                    m.what = msg.what;
                    m.obj = msg.obj;
                    this.sendMessage(m);
                    break;
                case 8:
                    m = AudioStateMessage.obtain();
                    m.what = 7;
                    m.obj = msg.obj;
                    this.sendMessage(m);
            }
    
            return false;
        }
    
        private void initView() {
            if (this.mAudioRecordListener != null) {
                this.mAudioRecordListener.initTipView();
            }
    
        }
    
        private void setTimeoutView(int counter) {
            if (this.mAudioRecordListener != null) {
                this.mAudioRecordListener.setTimeoutTipView(counter);
            }
    
        }
    
        private void setRecordingView() {
            if (this.mAudioRecordListener != null) {
                this.mAudioRecordListener.setRecordingTipView();
            }
    
        }
    
        private void setCancelView() {
            if (this.mAudioRecordListener != null) {
                this.mAudioRecordListener.setCancelTipView();
            }
    
        }
    
        private void destroyView() {
            Log.d("LQR_AudioRecordManager", "destroyTipView");
            this.mHandler.removeMessages(7);
            this.mHandler.removeMessages(8);
            this.mHandler.removeMessages(2);
            if (this.mAudioRecordListener != null) {
                this.mAudioRecordListener.destroyTipView();
            }
    
        }
    
        public void setMaxVoiceDuration(int maxVoiceDuration) {
            this.RECORD_INTERVAL = maxVoiceDuration;
        }
    
        public void setAudioSavePath(String path) {
            if (TextUtils.isEmpty(path)) {
                this.SAVE_PATH = this.mContext.getCacheDir().getAbsolutePath();
            } else {
                this.SAVE_PATH = path;
            }
    
        }
    
        public int getMaxVoiceDuration() {
            return this.RECORD_INTERVAL;
        }
    
        public void startRecord() {
            this.mAudioManager = (AudioManager)this.mContext.getSystemService("audio");
            if (this.mAfChangeListener != null) {
                this.mAudioManager.abandonAudioFocus(this.mAfChangeListener);
                this.mAfChangeListener = null;
            }
    
            this.mAfChangeListener = new AudioManager.OnAudioFocusChangeListener() {
                public void onAudioFocusChange(int focusChange) {
                    Log.d("LQR_AudioRecordManager", "OnAudioFocusChangeListener " + focusChange);
                    if (focusChange == -1) {
                        AudioRecordManager.this.mAudioManager.abandonAudioFocus(AudioRecordManager.this.mAfChangeListener);
                        AudioRecordManager.this.mAfChangeListener = null;
                        AudioRecordManager.this.sendEmptyMessage(6);
                    }
    
                }
            };
            this.sendEmptyMessage(1);
            if (this.mAudioRecordListener != null) {
                this.mAudioRecordListener.onStartRecord();
            }
    
        }
    
        public void willCancelRecord() {
            this.sendEmptyMessage(3);
        }
    
        public void continueRecord() {
            this.sendEmptyMessage(4);
        }
    
        public void stopRecord() {
            this.sendEmptyMessage(5);
        }
    
        public void destroyRecord() {
            AudioStateMessage msg = new AudioStateMessage();
            msg.obj = true;
            msg.what = 5;
            this.sendMessage(msg);
        }
    
        void sendMessage(AudioStateMessage message) {
            this.mCurAudioState.handleMessage(message);
        }
    
        void sendEmptyMessage(int event) {
            AudioStateMessage message = AudioStateMessage.obtain();
            message.what = event;
            this.mCurAudioState.handleMessage(message);
        }
    
        private void startRec() {
            Log.d("LQR_AudioRecordManager", "startRec");
    
            try {
                this.muteAudioFocus(this.mAudioManager, true);
                this.mAudioManager.setMode(0);
                this.mMediaRecorder = new MediaRecorder();
    
                try {
                    int bps = 7950;
                    this.mMediaRecorder.setAudioSamplingRate(8000);
                    this.mMediaRecorder.setAudioEncodingBitRate(bps);
                } catch (Resources.NotFoundException var2) {
                    var2.printStackTrace();
                }
    
            this.mMediaRecorder.setAudioChannels(1);
            this.mMediaRecorder.setAudioSource(1);   // 1 = MediaRecorder.AudioSource.MIC
            this.mMediaRecorder.setOutputFormat(3);  // 3 = MediaRecorder.OutputFormat.AMR_NB
            this.mMediaRecorder.setAudioEncoder(1);  // 1 = MediaRecorder.AudioEncoder.AMR_NB
                this.mAudioPath = Uri.fromFile(new File(this.SAVE_PATH, System.currentTimeMillis() + "temp.voice"));
                this.mMediaRecorder.setOutputFile(this.mAudioPath.getPath());
                this.mMediaRecorder.prepare();
                this.mMediaRecorder.start();
                Message e1 = Message.obtain();
                e1.what = 7;
                e1.obj = 10;
                this.mHandler.sendMessageDelayed(e1, (long)(this.RECORD_INTERVAL * 1000 - 10000));
            } catch (Exception var3) {
                var3.printStackTrace();
            }
    
        }
    
        private boolean checkAudioTimeLength() {
            long delta = SystemClock.elapsedRealtime() - this.smStartRecTime;
            return delta < 1000L;
        }
    
        private void stopRec() {
            Log.d("LQR_AudioRecordManager", "stopRec");
    
            try {
                this.muteAudioFocus(this.mAudioManager, false);
                if (this.mMediaRecorder != null) {
                    this.mMediaRecorder.stop();
                    this.mMediaRecorder.release();
                    this.mMediaRecorder = null;
                }
            } catch (Exception var2) {
                var2.printStackTrace();
            }
    
        }
    
        private void deleteAudioFile() {
            Log.d("LQR_AudioRecordManager", "deleteAudioFile");
            if (this.mAudioPath != null) {
                File file = new File(this.mAudioPath.getPath());
                if (file.exists()) {
                    file.delete();
                }
            }
    
        }
    
        private void finishRecord() {
            Log.d("LQR_AudioRecordManager", "finishRecord path = " + this.mAudioPath);
            if (this.mAudioRecordListener != null) {
                int duration = (int)(SystemClock.elapsedRealtime() - this.smStartRecTime) / 1000;
                this.mAudioRecordListener.onFinish(this.mAudioPath, duration);
            }
    
        }
    
        private void audioDBChanged() {
            if (this.mMediaRecorder != null) {
                int db = this.mMediaRecorder.getMaxAmplitude() / 600;
                if (this.mAudioRecordListener != null) {
                    this.mAudioRecordListener.onAudioDBChanged(db);
                }
            }
    
        }
    
        private void muteAudioFocus(AudioManager audioManager, boolean bMute) {
            if (Build.VERSION.SDK_INT < 8) {
                Log.d("LQR_AudioRecordManager", "muteAudioFocus Android 2.1 and below can not stop music");
            } else if (bMute) {
                audioManager.requestAudioFocus(this.mAfChangeListener, 3, 2);
            } else {
                audioManager.abandonAudioFocus(this.mAfChangeListener);
                this.mAfChangeListener = null;
            }
    
        }
    
        public IAudioRecordListener getAudioRecordListener() {
            return this.mAudioRecordListener;
        }
    
        public void setAudioRecordListener(IAudioRecordListener audioRecordListener) {
            this.mAudioRecordListener = audioRecordListener;
        }
    
        class IdleState extends IAudioState {
            public IdleState() {
                Log.d("LQR_AudioRecordManager", "IdleState");
            }
    
            void enter() {
                super.enter();
                if (AudioRecordManager.this.mHandler != null) {
                    AudioRecordManager.this.mHandler.removeMessages(7);
                    AudioRecordManager.this.mHandler.removeMessages(8);
                    AudioRecordManager.this.mHandler.removeMessages(2);
                }
    
            }
    
            void handleMessage(AudioStateMessage msg) {
                Log.d("LQR_AudioRecordManager", "IdleState handleMessage : " + msg.what);
                switch(msg.what) {
                    case 1:
                        AudioRecordManager.this.initView();
                        AudioRecordManager.this.setRecordingView();
                        AudioRecordManager.this.startRec();
                        AudioRecordManager.this.smStartRecTime = SystemClock.elapsedRealtime();
                        AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.recordState;
                        AudioRecordManager.this.sendEmptyMessage(2);
                    default:
                }
            }
        }
    
        class RecordState extends IAudioState {
            RecordState() {
            }
    
            void handleMessage(AudioStateMessage msg) {
                Log.d("LQR_AudioRecordManager", this.getClass().getSimpleName() + " handleMessage : " + msg.what);
                switch(msg.what) {
                    case 2:
                        AudioRecordManager.this.audioDBChanged();
                        AudioRecordManager.this.mHandler.sendEmptyMessageDelayed(2, 150L);
                        break;
                    case 3:
                        AudioRecordManager.this.setCancelView();
                        AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.cancelState;
                    case 4:
                    default:
                        break;
                    case 5:
                        final boolean checked = AudioRecordManager.this.checkAudioTimeLength();
                        boolean activityFinished = false;
                        if (msg.obj != null) {
                            activityFinished = (Boolean)msg.obj;
                        }
    
                        if (checked && !activityFinished) {
                            if (AudioRecordManager.this.mAudioRecordListener != null) {
                                AudioRecordManager.this.mAudioRecordListener.setAudioShortTipView();
                            }
    
                            AudioRecordManager.this.mHandler.removeMessages(2);
                        }
    
                        if (!activityFinished && AudioRecordManager.this.mHandler != null) {
                            AudioRecordManager.this.mHandler.postDelayed(new Runnable() {
                                public void run() {
                                    AudioStateMessage message = AudioStateMessage.obtain();
                                    message.what = 9;
                                    message.obj = !checked;
                                    AudioRecordManager.this.sendMessage(message);
                                }
                            }, 500L);
                            AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.sendingState;
                        } else {
                            AudioRecordManager.this.stopRec();
                            if (!checked && activityFinished) {
                                AudioRecordManager.this.finishRecord();
                            }
    
                            AudioRecordManager.this.destroyView();
                            AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
                        }
                        break;
                    case 6:
                        AudioRecordManager.this.stopRec();
                        AudioRecordManager.this.destroyView();
                        AudioRecordManager.this.deleteAudioFile();
                        AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
                        AudioRecordManager.this.idleState.enter();
                        break;
                    case 7:
                        int counter = (Integer)msg.obj;
                        AudioRecordManager.this.setTimeoutView(counter);
                        AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.timerState;
                        if (counter > 0) {
                            Message message = Message.obtain();
                            message.what = 8;
                            message.obj = counter - 1;
                            AudioRecordManager.this.mHandler.sendMessageDelayed(message, 1000L);
                        } else {
                            AudioRecordManager.this.mHandler.postDelayed(new Runnable() {
                                public void run() {
                                    AudioRecordManager.this.stopRec();
                                    AudioRecordManager.this.finishRecord();
                                    AudioRecordManager.this.destroyView();
                                }
                            }, 500L);
                            AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
                        }
                }
    
            }
        }
    
        class SendingState extends IAudioState {
            SendingState() {
            }
    
            void handleMessage(AudioStateMessage message) {
                Log.d("LQR_AudioRecordManager", "SendingState handleMessage " + message.what);
                switch(message.what) {
                    case 9:
                        AudioRecordManager.this.stopRec();
                        if ((Boolean)message.obj) {
                            AudioRecordManager.this.finishRecord();
                        }
    
                        AudioRecordManager.this.destroyView();
                        AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
                    default:
                }
            }
        }
    
        class CancelState extends IAudioState {
            CancelState() {
            }
    
            void handleMessage(AudioStateMessage msg) {
                Log.d("LQR_AudioRecordManager", this.getClass().getSimpleName() + " handleMessage : " + msg.what);
                switch(msg.what) {
                    case 1:
                    case 2:
                    case 3:
                    default:
                        break;
                    case 4:
                        AudioRecordManager.this.setRecordingView();
                        AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.recordState;
                        AudioRecordManager.this.sendEmptyMessage(2);
                        break;
                    case 5:
                    case 6:
                        AudioRecordManager.this.stopRec();
                        AudioRecordManager.this.destroyView();
                        AudioRecordManager.this.deleteAudioFile();
                        AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
                        AudioRecordManager.this.idleState.enter();
                        break;
                    case 7:
                        int counter = (Integer)msg.obj;
                        if (counter > 0) {
                            Message message = Message.obtain();
                            message.what = 8;
                            message.obj = counter - 1;
                            AudioRecordManager.this.mHandler.sendMessageDelayed(message, 1000L);
                        } else {
                            AudioRecordManager.this.mHandler.postDelayed(new Runnable() {
                                public void run() {
                                    AudioRecordManager.this.stopRec();
                                    AudioRecordManager.this.finishRecord();
                                    AudioRecordManager.this.destroyView();
                                }
                            }, 500L);
                            AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
                            AudioRecordManager.this.idleState.enter();
                        }
                }
    
            }
        }
    
        class TimerState extends IAudioState {
            TimerState() {
            }
    
            void handleMessage(AudioStateMessage msg) {
                Log.d("LQR_AudioRecordManager", this.getClass().getSimpleName() + " handleMessage : " + msg.what);
                switch(msg.what) {
                    case 3:
                        AudioRecordManager.this.setCancelView();
                        AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.cancelState;
                    case 4:
                    default:
                        break;
                    case 5:
                        AudioRecordManager.this.mHandler.postDelayed(new Runnable() {
                            public void run() {
                                AudioRecordManager.this.stopRec();
                                AudioRecordManager.this.finishRecord();
                                AudioRecordManager.this.destroyView();
                            }
                        }, 500L);
                        AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
                        AudioRecordManager.this.idleState.enter();
                        break;
                    case 6:
                        AudioRecordManager.this.stopRec();
                        AudioRecordManager.this.destroyView();
                        AudioRecordManager.this.deleteAudioFile();
                        AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
                        AudioRecordManager.this.idleState.enter();
                        break;
                    case 7:
                        int counter = (Integer)msg.obj;
                        if (counter > 0) {
                            Message message = Message.obtain();
                            message.what = 8;
                            message.obj = counter - 1;
                            AudioRecordManager.this.mHandler.sendMessageDelayed(message, 1000L);
                            AudioRecordManager.this.setTimeoutView(counter);
                        } else {
                            AudioRecordManager.this.mHandler.postDelayed(new Runnable() {
                                public void run() {
                                    AudioRecordManager.this.stopRec();
                                    AudioRecordManager.this.finishRecord();
                                    AudioRecordManager.this.destroyView();
                                }
                            }, 500L);
                            AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
                        }
                }
    
            }
        }
    }

    AudioPlayManager

    package com.badao.audiodemo.audioHelper;
    
    import android.annotation.TargetApi;
    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.media.AudioManager;
    import android.media.MediaPlayer;
    import android.media.AudioManager.OnAudioFocusChangeListener;
    import android.media.MediaPlayer.OnCompletionListener;
    import android.media.MediaPlayer.OnErrorListener;
    import android.media.MediaPlayer.OnPreparedListener;
    import android.media.MediaPlayer.OnSeekCompleteListener;
    import android.net.Uri;
    import android.os.PowerManager;
    import android.os.Build.VERSION;
    import android.os.PowerManager.WakeLock;
    import android.util.Log;
    import java.io.IOException;
    
    public class AudioPlayManager implements SensorEventListener {
        private static final String TAG = "LQR_AudioPlayManager";
        private MediaPlayer _mediaPlayer;
        private IAudioPlayListener _playListener;
        private Uri _playingUri;
        private Sensor _sensor;
        private SensorManager _sensorManager;
        private AudioManager _audioManager;
        private PowerManager _powerManager;
        private WakeLock _wakeLock;
        private OnAudioFocusChangeListener afChangeListener;
        private Context context;
    
        public AudioPlayManager() {
        }
    
        public static AudioPlayManager getInstance() {
            return AudioPlayManager.SingletonHolder.sInstance;
        }
    
        @TargetApi(11)
        public void onSensorChanged(SensorEvent event) {
            float range = event.values[0];
            if (this._sensor != null && this._mediaPlayer != null) {
                if (this._mediaPlayer.isPlaying()) {
                    if ((double)range > 0.0D) {
                        if (this._audioManager.getMode() == 0) {
                            return;
                        }
    
                        this._audioManager.setMode(0);
                        this._audioManager.setSpeakerphoneOn(true);
                        final int positions = this._mediaPlayer.getCurrentPosition();
    
                        try {
                            this._mediaPlayer.reset();
                            this._mediaPlayer.setAudioStreamType(3);
                            this._mediaPlayer.setVolume(1.0F, 1.0F);
                            this._mediaPlayer.setDataSource(this.context, this._playingUri);
                            this._mediaPlayer.setOnPreparedListener(new OnPreparedListener() {
                                public void onPrepared(MediaPlayer mp) {
                                    mp.seekTo(positions);
                                }
                            });
                            this._mediaPlayer.setOnSeekCompleteListener(new OnSeekCompleteListener() {
                                public void onSeekComplete(MediaPlayer mp) {
                                    mp.start();
                                }
                            });
                            this._mediaPlayer.prepareAsync();
                        } catch (IOException var5) {
                            var5.printStackTrace();
                        }
    
                        this.setScreenOn();
                    } else {
                        this.setScreenOff();
                        if (VERSION.SDK_INT >= 11) {
                            if (this._audioManager.getMode() == 3) {
                                return;
                            }
    
                            this._audioManager.setMode(3);
                        } else {
                            if (this._audioManager.getMode() == 2) {
                                return;
                            }
    
                            this._audioManager.setMode(2);
                        }
    
                        this._audioManager.setSpeakerphoneOn(false);
                        this.replay();
                    }
                } else if ((double)range > 0.0D) {
                    if (this._audioManager.getMode() == 0) {
                        return;
                    }
    
                    this._audioManager.setMode(0);
                    this._audioManager.setSpeakerphoneOn(true);
                    this.setScreenOn();
                }
            }
    
        }
    
        @TargetApi(21)
        private void setScreenOff() {
            if (this._wakeLock == null) {
                if (VERSION.SDK_INT >= 21) {
                    this._wakeLock = this._powerManager.newWakeLock(32, "AudioPlayManager");
                } else {
                    Log.e("LQR_AudioPlayManager", "Does not support on level " + VERSION.SDK_INT);
                }
            }
    
            if (this._wakeLock != null) {
                this._wakeLock.acquire();
            }
    
        }
    
        private void setScreenOn() {
            if (this._wakeLock != null) {
                this._wakeLock.setReferenceCounted(false);
                this._wakeLock.release();
                this._wakeLock = null;
            }
    
        }
    
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
        }
    
        private void replay() {
            try {
                this._mediaPlayer.reset();
                this._mediaPlayer.setAudioStreamType(0);
                this._mediaPlayer.setVolume(1.0F, 1.0F);
                this._mediaPlayer.setDataSource(this.context, this._playingUri);
                this._mediaPlayer.setOnPreparedListener(new OnPreparedListener() {
                    public void onPrepared(MediaPlayer mp) {
                        mp.start();
                    }
                });
                this._mediaPlayer.prepareAsync();
            } catch (IOException var2) {
                var2.printStackTrace();
            }
    
        }
    
        public void startPlay(Context context, Uri audioUri, IAudioPlayListener playListener) {
            if (context != null && audioUri != null) {
                this.context = context;
                if (this._playListener != null && this._playingUri != null) {
                    this._playListener.onStop(this._playingUri);
                }
    
                this.resetMediaPlayer();
                this.afChangeListener = new OnAudioFocusChangeListener() {
                    public void onAudioFocusChange(int focusChange) {
                        Log.d("LQR_AudioPlayManager", "OnAudioFocusChangeListener " + focusChange);
                        if (AudioPlayManager.this._audioManager != null && focusChange == -1) {
                            AudioPlayManager.this._audioManager.abandonAudioFocus(AudioPlayManager.this.afChangeListener);
                            AudioPlayManager.this.afChangeListener = null;
                            AudioPlayManager.this.resetMediaPlayer();
                        }
    
                    }
                };
    
                try {
                    this._powerManager = (PowerManager)context.getSystemService("power");
                    this._audioManager = (AudioManager)context.getSystemService("audio");
                    if (!this._audioManager.isWiredHeadsetOn()) {
                        this._sensorManager = (SensorManager)context.getSystemService("sensor");
                        this._sensor = this._sensorManager.getDefaultSensor(8);
                        this._sensorManager.registerListener(this, this._sensor, 3);
                    }
    
                    this.muteAudioFocus(this._audioManager, true);
                    this._playListener = playListener;
                    this._playingUri = audioUri;
                    this._mediaPlayer = new MediaPlayer();
                    this._mediaPlayer.setOnCompletionListener(new OnCompletionListener() {
                        public void onCompletion(MediaPlayer mp) {
                            if (AudioPlayManager.this._playListener != null) {
                                AudioPlayManager.this._playListener.onComplete(AudioPlayManager.this._playingUri);
                                AudioPlayManager.this._playListener = null;
                                AudioPlayManager.this.context = null;
                            }
    
                            AudioPlayManager.this.reset();
                        }
                    });
                    this._mediaPlayer.setOnErrorListener(new OnErrorListener() {
                        public boolean onError(MediaPlayer mp, int what, int extra) {
                            AudioPlayManager.this.reset();
                            return true;
                        }
                    });
                    this._mediaPlayer.setDataSource(context, audioUri);
                    this._mediaPlayer.setAudioStreamType(3);
                    this._mediaPlayer.prepare();
                    this._mediaPlayer.start();
                    if (this._playListener != null) {
                        this._playListener.onStart(this._playingUri);
                    }
                } catch (Exception var5) {
                    var5.printStackTrace();
                    if (this._playListener != null) {
                        this._playListener.onStop(audioUri);
                        this._playListener = null;
                    }
    
                    this.reset();
                }
            } else {
                Log.e("LQR_AudioPlayManager", "startPlay context or audioUri is null.");
            }
    
        }
    
        public void setPlayListener(IAudioPlayListener listener) {
            this._playListener = listener;
        }
    
        public void stopPlay() {
            if (this._playListener != null && this._playingUri != null) {
                this._playListener.onStop(this._playingUri);
            }
    
            this.reset();
        }
    
        private void reset() {
            this.resetMediaPlayer();
            this.resetAudioPlayManager();
        }
    
        private void resetAudioPlayManager() {
            if (this._audioManager != null) {
                this.muteAudioFocus(this._audioManager, false);
            }
    
            if (this._sensorManager != null) {
                this._sensorManager.unregisterListener(this);
            }
    
            this._sensorManager = null;
            this._sensor = null;
            this._powerManager = null;
            this._audioManager = null;
            this._wakeLock = null;
            this._playListener = null;
            this._playingUri = null;
        }
    
        private void resetMediaPlayer() {
            if (this._mediaPlayer != null) {
                try {
                    this._mediaPlayer.stop();
                    this._mediaPlayer.reset();
                    this._mediaPlayer.release();
                    this._mediaPlayer = null;
                } catch (IllegalStateException var2) {
                }
            }
    
        }
    
        public Uri getPlayingUri() {
            return this._playingUri;
        }
    
        @TargetApi(8)
        private void muteAudioFocus(AudioManager audioManager, boolean bMute) {
            if (VERSION.SDK_INT < 8) {
                Log.d("LQR_AudioPlayManager", "muteAudioFocus Android 2.1 and below can not stop music");
            } else if (bMute) {
                audioManager.requestAudioFocus(this.afChangeListener, 3, 2);
            } else {
                audioManager.abandonAudioFocus(this.afChangeListener);
                this.afChangeListener = null;
            }
    
        }
    
        static class SingletonHolder {
            static AudioPlayManager sInstance = new AudioPlayManager();
    
            SingletonHolder() {
            }
        }
    }

    AudioStateMessage

    package com.badao.audiodemo.audioHelper;
    
    public class AudioStateMessage {
        public int what;
        public Object obj;
    
        public AudioStateMessage() {
        }
    
        public static AudioStateMessage obtain() {
            return new AudioStateMessage();
        }
    }

    IAudioPlayListener

    package com.badao.audiodemo.audioHelper;
    
    import android.net.Uri;
    
    public interface IAudioPlayListener {
        void onStart(Uri var1);
    
        void onStop(Uri var1);
    
        void onComplete(Uri var1);
    }

    IAudioRecordListener

    package com.badao.audiodemo.audioHelper;
    
    import android.net.Uri;
    
    public interface IAudioRecordListener {
        void initTipView();
    
        void setTimeoutTipView(int var1);
    
        void setRecordingTipView();
    
        void setAudioShortTipView();
    
        void setCancelTipView();
    
        void destroyTipView();
    
        void onStartRecord();
    
        void onFinish(Uri var1, int var2);
    
        void onAudioDBChanged(int var1);
    }

    IAudioState

    package com.badao.audiodemo.audioHelper;
    
    public abstract class IAudioState {
        public IAudioState() {
        }
    
        void enter() {
        }
    
        abstract void handleMessage(AudioStateMessage var1);
    }

    录音功能实现

    打开MainActivity,声明audioRecordManager

    private AudioRecordManager audioRecordManager;

    然后在onCreate中对其进行初始化

        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_main);
    
            audioRecordManager = AudioRecordManager.getInstance(MainActivity.this);
            File file = new File(MainActivity.this.getExternalFilesDir("voice").getAbsolutePath());
            if (!file.exists()) {
                file.mkdirs();
            }
            //设置录音文件保存路径
            audioRecordManager.setAudioSavePath(file.getAbsolutePath());
            //设置监听器
            audioRecordManager.setAudioRecordListener(recordListener);

    初始化之后设置其录音文件的的保存路径和监听器

    所以需要先声明并初始化一个监听器

        private final IAudioRecordListener recordListener = new IAudioRecordListener() {
    
            @Override
            public void initTipView() {
    
            }
    
            @Override
            public void setTimeoutTipView(int i) {
    
            }
    
            @Override
            public void setRecordingTipView() {
    
            }
    
            @Override
            public void setAudioShortTipView() {
    
            }
    
            @Override
            public void setCancelTipView() {
    
            }
    
            @Override
            public void destroyTipView() {
    
            }
    
            @Override
            public void onStartRecord() {
    
            }
    
            @Override
            public void onFinish(Uri uri, int i) {
                File file = new File(uri.getPath());
                //获取文件时长
                int voiceDuration = 0;
                try {
                    meidaPlayer.setDataSource(uri.getPath());
                    meidaPlayer.prepare();
                    int time = meidaPlayer.getDuration();//获得了时长(以毫秒为单位)
                    voiceDuration = time / 1000;
                    if (voiceDuration < 1) {
                        voiceDuration = 1;
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                } finally {
                    meidaPlayer.reset();
                }
                ChatBean.ChatItem chatItem = new ChatBean.ChatItem();
    
                chatItem.setId((int) System.currentTimeMillis());
                chatItem.setSendTime(new Date().toString());
                chatItem.setContent(file.getAbsolutePath());
                //存储语音文件时长
                chatItem.setVoiceDuration(voiceDuration);
    
                chatItemList.add(chatItem);
                chatAdapter.setmEntityList(chatItemList);
    
            }
    
            @Override
            public void onAudioDBChanged(int i) {
    
            }
        };

    在监听器中重写的onFinish方法就是录音结束后的回调方法,Uri就是录音文件的路径

    在此方法中获取录音的时长以及设置一些其他参数,然后将其通过Adapter给RecyclerView进行赋值。

    关于Android中使用Adapter(适配器)给RecycleView设置数据源:

    参考如下:

    https://blog.csdn.net/BADAO_LIUMANG_QIZHI/article/details/110926089

    然后在MainActivity中的imageView的Touch事件中

            audioImageView.setOnTouchListener(new View.OnTouchListener() {
                @Override
                public boolean onTouch(View view, MotionEvent motionEvent) {
                    if(motionEvent.getAction() == MotionEvent.ACTION_DOWN)
                    {
                        Toast.makeText(MainActivity.this,"录音开始",Toast.LENGTH_SHORT).show();
                        audioRecordManager.startRecord();
                        return true;
                    }else if(motionEvent.getAction() == MotionEvent.ACTION_UP)
                    {
                        Toast.makeText(MainActivity.this,"录音结束",Toast.LENGTH_SHORT).show();
                        audioRecordManager.stopRecord();
                        return true;
                    }
                    return false;
                }
            });

    实现开始录音与结束录音。

    录音播放功能

    参照上面录音结束后使用adapter给recyclerView设置数据源的方式,在ChatAdapter中重写的

    onBindViewHolder中

        @Override
        public void onBindViewHolder(@NonNull ChatViewHolder holder, int position) {
            holder.mText.setText(mEntityList.get(position).getContent().toString());
            //设置每一项的点击事件
            holder.itemView.setOnClickListener(new View.OnClickListener() {
                @Override
                public void onClick(View view) {
                    File file = new File(mEntityList.get(position).getContent());
                    //播放音频文件
                    audioPlayManager.startPlay(App.context, Uri.fromFile(file), new IAudioPlayListener() {
                        @Override
                        public void onStart(Uri uri1) {
    
                        }
    
                        @Override
                        public void onStop(Uri uri1) {
    
                        }
    
                        @Override
                        public void onComplete(Uri uri1) {
    
                        }
                    });
                }
            });
        }

    设置每一项的点击事件并播放音频文件

    在播放的方法startPlay中第一个参数为context,为了获取Context,参照如下

    Android中怎样在工具类中获取Context对象:

    https://blog.csdn.net/BADAO_LIUMANG_QIZHI/article/details/110817765

    示例代码下载

    https://download.csdn.net/download/BADAO_LIUMANG_QIZHI/13609812

    展开全文
  • [html] input上传图片怎样触发默认拍照功能 使用 capture 属性,capture 的值可以是: camera 打开摄像头 user 打开前置摄像头 environment 打开后置摄像头 以上几个属性都不能保证设备会按照设置的一样打开...

    [html] input上传图片怎样触发默认拍照功能

    使用 capture 属性,capture 的值可以是:
    
        camera 打开摄像头
    
        user 打开前置摄像头
    
        environment 打开后置摄像头
        以上几个属性都不能保证设备会按照设置的一样打开前置或后置摄像头,如果设备不支持的话,它会使用默认的调用摄像头的行为。
    
        camcorder 打开录像
    
        microphone 打开录音机
    
    

    个人简介

    我是歌谣,欢迎和大家一起交流前后端知识。放弃很容易,
    但坚持一定很酷。欢迎大家一起讨论

    主目录

    与歌谣一起通关前端面试题

    展开全文
  • [html] input上传图片怎样触发默认拍照功能使用 capture 属性,capture 的值可以是: camera 打开摄像头 user 打开前置摄像头 environment 打开后置摄像头 以上几个属性都不能保证设备会按照设置的一样...
  • 最近在做android多媒体开发,已经实践过录像和录音功能。现在项目要求,android录制视频时,将原始视频数据保存下来,也就是yuv420sp格式的视频,以便以后对这个原始数据进一步的编码和修改。 最近查资料知道可以...
  • 日常生活中大家都会使用浏览器或者播放器打开一些视频或者听一些歌曲,当我们在听到好听的声音时,想下载到本地,却找不到下载入口,尤其是那些收费平台下载歌曲还要会员,并且受到...QVE音频剪辑-一款多功能录音软.
  • 二次开发电话录音盒开发接口包

    热门讨论 2011-03-29 09:48:00
    o 软件摘/挂机拨号,来电弹屏相关 o 各种设备型号的功能控制接口 o 单路开发模块提供语音识别控制 o 多路开发包提供软交换控制模块 子佩电话录音盒来电号码显示支持哪些制式?准确精度达到什么程度?是否支持二次...
  • 在使用audition录音后,如何处理人声,使我们录音后...怎样使用压缩器,让我们的人声能够达到比较理想的效果,本视频教程有比较详尽的讲解,对于使用audition录音以及音频处理都有相当的帮助,使你豁然开朗,喜乐满足。
  • Microsoft OneNote可以把各种记录整合在一起,例如OneNote文档可以同时包含文本,会议录音,通过电脑编辑的流程图,UML图等,也可以包含手工画的草图。其搜索功能非常强大,包括文本搜索,语言搜索,甚至在图片中...
  • Android语音文件speex编码解码(一)

    千次阅读 2017-02-21 16:24:24
    Android中使用AudioRecord录音后的格式为pcm,要想播放需要转换格式,可以加入44字节的头转换为wav格式后播放,并且在网络上传输最好把音频压缩,压缩为speex文件方便传输,节省流量,下面讲解如何生成speex的so库,...
  • 产品管理笔记

    2018-06-17 10:29:25
    明确组织架构2、初始调研,不谈设计,不要瞎想用户需求3、需要考虑维护需求:配置、运维、升级4、问有没有竞品,例如之前是否已经在使用一套系统,问哪些功能点、模块比较常用的、不好用的。5、调研养成录音习惯。6...
  • 114和118114

    千次阅读 2009-05-28 08:15:00
    网上流传一个笑话,说一个科学家极力向人推介他的新发明:“我发明的手机可高级了,什么收音机、录音机、mp3、电子书、照相机等功能一应俱全,还能当GPS使用。不光如此,在走夜路遇到坏人时,你按下这个按钮,它还能...
  • LINUX 24学时教程

    2011-10-21 18:33:38
    6.3.3 怎样使用管道 93 6.4 建立shell的命令脚本程序 95 6.5 课时小结 98 6.6 专家答疑 98 6.7 练习题 98 第7学时 使用X窗口系统 99 7.1 启动X 99 7.1.1 使用不同的颜色深度启动X11 100 7.1.2 在X11环境中使用虚拟...
  • 多媒体教室

    2013-06-14 08:10:31
    图形按钮区按钮为亮黑色时表示此功能可以使用,为灰色时表示此功能不能使用,为凹下状态表示该功能正在执行。当按下某按钮执行某一功能后,再按 Break 键使这一正在执行的功能停止执行。可以使用系统设置下热键设置...
  • 使用紧急修复磁盘ERD的功能修复系统了。  (2)网络安全模式:和安全模式类似,但是增加了对网络连接的支持。在 局域网环境中解决Windows XP的启动故障,此选项很有用。  (3)命令提示符的安全模式:也和安全...
  • 实例010 具有提示功能的工具栏 9 1.3 状态栏设计 10 实例011 在状态栏中显示检查框 10 实例012 带进度条的状态栏 11 实例013 状态栏中加入图标 12 1.4 导航菜单界面 12 实例014 OutLook界面 12 实例...
  • C#.net_经典编程例子400个

    热门讨论 2013-05-17 09:25:30
    329 实例238 列出系统中的打印机 330 7.12 其他 332 实例239 两种信息发送方式 332 实例240 功能快捷键 336 第8章 注册表 339 8.1 操作注册表 340 实例241 怎样存取注册表信息 ...
  •  实例100 使用FileSystemWatcher组件监视系统日志文件是否被更改 140  3.5 HelpProvider组件 142  实例101 使用HelpProvider组件调用帮助文件 142  3.6 Process组件 143  实例102 使用Process组件访问...
  • 实例265 怎样调用外部的EXE文件 361 实例266 关闭外部已开启的程序 362 7.10 程序运行 363 实例267 防止程序多次运行 363 实例268 程序运行时禁止关机 364 实例269 获取任务栏尺寸大小 365 实例270 改变系统提示信息...
  • C#程序开发范例宝典(第2版).part02

    热门讨论 2012-11-12 07:55:11
    一部久享盛誉的程序开发宝典。精选570个典型范例,全面覆盖实用和热点技术,涉及面...实例265 怎样调用外部的EXE文件 361 实例266 关闭外部已开启的程序 362 7.10 程序运行 363 实例267 防止程序多次运行 363 实例...
  • C#程序开发范例宝典(第2版).part13

    热门讨论 2012-11-12 20:17:14
    一部久享盛誉的程序开发宝典。精选570个典型范例,全面覆盖实用和热点技术,涉及面...实例265 怎样调用外部的EXE文件 361 实例266 关闭外部已开启的程序 362 7.10 程序运行 363 实例267 防止程序多次运行 363 实例...
  • 一部久享盛誉的程序开发宝典。精选570个典型范例,全面覆盖实用和热点技术,涉及面...实例265 怎样调用外部的EXE文件 361 实例266 关闭外部已开启的程序 362 7.10 程序运行 363 实例267 防止程序多次运行 363 实例...
  • 一部久享盛誉的程序开发宝典。精选570个典型范例,全面覆盖实用和热点技术,涉及面...实例265 怎样调用外部的EXE文件 361 实例266 关闭外部已开启的程序 362 7.10 程序运行 363 实例267 防止程序多次运行 363 实例...
  • 一部久享盛誉的程序开发宝典。精选570个典型范例,全面覆盖实用和热点技术,涉及面...实例265 怎样调用外部的EXE文件 361 实例266 关闭外部已开启的程序 362 7.10 程序运行 363 实例267 防止程序多次运行 363 实例...
  • 程序开发范例宝典>>

    2012-10-24 10:41:28
    实例108 使用FileSystemWatcher组件监视系统日志文件是否被更改 157 3.5 HelpProvider组件 158 实例109 使用HelpProvider组件调用帮助文件 159 3.6 Process组件 159 实例110 使用Process组件访问本地...
  • 一部久享盛誉的程序开发宝典。精选570个典型范例,全面覆盖实用和热点技术,涉及面...实例265 怎样调用外部的EXE文件 361 实例266 关闭外部已开启的程序 362 7.10 程序运行 363 实例267 防止程序多次运行 363 实例...

空空如也

空空如也

1 2 3
收藏数 45
精华内容 18
关键字:

怎样使用录音功能