Scenario
How to implement, in Android, the press-and-hold voice recording used before sending a voice message, show the recorded files in a RecyclerView, and play a recording by tapping its item.
Implementation
Page layout
Create a new project and build the page layout shown below.
Layout code:
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <androidx.recyclerview.widget.RecyclerView
        android:id="@+id/recycler"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        app:layout_constraintBottom_toTopOf="@+id/layout_bottom"
        app:layout_constraintTop_toTopOf="parent" />

    <RelativeLayout
        android:id="@+id/layout_bottom"
        android:layout_width="match_parent"
        android:layout_height="101dp"
        android:background="#F7F7F7"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintStart_toStartOf="parent">

        <ImageView
            android:id="@+id/img_voice"
            android:layout_width="66dp"
            android:layout_height="66dp"
            android:layout_centerInParent="true"
            android:background="@mipmap/badao" />
    </RelativeLayout>
</androidx.constraintlayout.widget.ConstraintLayout>
Then look up these views in MainActivity and handle the press and release events of the bottom ImageView:
public class MainActivity extends AppCompatActivity {

    private ImageView audioImageView;
    private RecyclerView mRecyclerView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        mRecyclerView = findViewById(R.id.recycler);
        audioImageView = findViewById(R.id.img_voice);
        audioImageView.setOnTouchListener(new View.OnTouchListener() {
            @Override
            public boolean onTouch(View view, MotionEvent motionEvent) {
                if (motionEvent.getAction() == MotionEvent.ACTION_DOWN) {
                    Toast.makeText(MainActivity.this, "Recording started", Toast.LENGTH_SHORT).show();
                    return true;
                } else if (motionEvent.getAction() == MotionEvent.ACTION_UP) {
                    Toast.makeText(MainActivity.this, "Recording stopped", Toast.LENGTH_SHORT).show();
                    return true;
                }
                return false;
            }
        });
    }
}
Grant recording permissions
Open AndroidManifest.xml and add the following permissions:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
Encapsulate the recording helper classes
Under the project's package, create an audioHelper package and add the following interfaces and classes to it.
AudioRecordManager
package com.badao.audiodemo.audioHelper;
import android.annotation.TargetApi;
import android.content.Context;
import android.content.res.Resources;
import android.media.AudioManager;
import android.media.MediaRecorder;
import android.net.Uri;
import android.os.Build;
import android.os.Handler;
import android.os.Message;
import android.os.SystemClock;
import android.telephony.PhoneStateListener;
import android.telephony.TelephonyManager;
import android.text.TextUtils;
import android.util.Log;
import java.io.File;
public class AudioRecordManager implements Handler.Callback {
private static final String TAG = "LQR_AudioRecordManager";
private int RECORD_INTERVAL;
private String SAVE_PATH;
private IAudioState mCurAudioState;
private Context mContext;
private Handler mHandler;
private AudioManager mAudioManager;
private MediaRecorder mMediaRecorder;
private Uri mAudioPath;
private long smStartRecTime;
private AudioManager.OnAudioFocusChangeListener mAfChangeListener;
IAudioState idleState;
IAudioState recordState;
IAudioState sendingState;
IAudioState cancelState;
IAudioState timerState;
private IAudioRecordListener mAudioRecordListener;
public static AudioRecordManager mInstance;
public static AudioRecordManager getInstance(Context context) {
if (mInstance == null) {
synchronized(AudioRecordManager.class) {
if (mInstance == null) {
mInstance = new AudioRecordManager(context);
}
}
}
return mInstance;
}
@TargetApi(21)
private AudioRecordManager(Context context) {
this.mContext = context;
this.mHandler = new Handler(this);
this.RECORD_INTERVAL = 60;
this.idleState = new AudioRecordManager.IdleState();
this.recordState = new AudioRecordManager.RecordState();
this.sendingState = new AudioRecordManager.SendingState();
this.cancelState = new AudioRecordManager.CancelState();
this.timerState = new AudioRecordManager.TimerState();
if (Build.VERSION.SDK_INT < 21) {
try {
TelephonyManager e = (TelephonyManager)this.mContext.getSystemService("phone");
e.listen(new PhoneStateListener() {
public void onCallStateChanged(int state, String incomingNumber) {
switch(state) {
case 1:
AudioRecordManager.this.sendEmptyMessage(6);
case 0:
case 2:
default:
super.onCallStateChanged(state, incomingNumber);
}
}
}, 32);
} catch (SecurityException var3) {
var3.printStackTrace();
}
}
this.mCurAudioState = this.idleState;
this.idleState.enter();
}
public final boolean handleMessage(Message msg) {
Log.i("LQR_AudioRecordManager", "handleMessage " + msg.what);
AudioStateMessage m;
switch(msg.what) {
case 2:
this.sendEmptyMessage(2);
break;
case 7:
m = AudioStateMessage.obtain();
m.what = msg.what;
m.obj = msg.obj;
this.sendMessage(m);
break;
case 8:
m = AudioStateMessage.obtain();
m.what = 7;
m.obj = msg.obj;
this.sendMessage(m);
}
return false;
}
private void initView() {
if (this.mAudioRecordListener != null) {
this.mAudioRecordListener.initTipView();
}
}
private void setTimeoutView(int counter) {
if (this.mAudioRecordListener != null) {
this.mAudioRecordListener.setTimeoutTipView(counter);
}
}
private void setRecordingView() {
if (this.mAudioRecordListener != null) {
this.mAudioRecordListener.setRecordingTipView();
}
}
private void setCancelView() {
if (this.mAudioRecordListener != null) {
this.mAudioRecordListener.setCancelTipView();
}
}
private void destroyView() {
Log.d("LQR_AudioRecordManager", "destroyTipView");
this.mHandler.removeMessages(7);
this.mHandler.removeMessages(8);
this.mHandler.removeMessages(2);
if (this.mAudioRecordListener != null) {
this.mAudioRecordListener.destroyTipView();
}
}
public void setMaxVoiceDuration(int maxVoiceDuration) {
this.RECORD_INTERVAL = maxVoiceDuration;
}
public void setAudioSavePath(String path) {
if (TextUtils.isEmpty(path)) {
this.SAVE_PATH = this.mContext.getCacheDir().getAbsolutePath();
} else {
this.SAVE_PATH = path;
}
}
public int getMaxVoiceDuration() {
return this.RECORD_INTERVAL;
}
public void startRecord() {
this.mAudioManager = (AudioManager)this.mContext.getSystemService("audio");
if (this.mAfChangeListener != null) {
this.mAudioManager.abandonAudioFocus(this.mAfChangeListener);
this.mAfChangeListener = null;
}
this.mAfChangeListener = new AudioManager.OnAudioFocusChangeListener() {
public void onAudioFocusChange(int focusChange) {
Log.d("LQR_AudioRecordManager", "OnAudioFocusChangeListener " + focusChange);
if (focusChange == -1) {
AudioRecordManager.this.mAudioManager.abandonAudioFocus(AudioRecordManager.this.mAfChangeListener);
AudioRecordManager.this.mAfChangeListener = null;
AudioRecordManager.this.sendEmptyMessage(6);
}
}
};
this.sendEmptyMessage(1);
if (this.mAudioRecordListener != null) {
this.mAudioRecordListener.onStartRecord();
}
}
public void willCancelRecord() {
this.sendEmptyMessage(3);
}
public void continueRecord() {
this.sendEmptyMessage(4);
}
public void stopRecord() {
this.sendEmptyMessage(5);
}
public void destroyRecord() {
AudioStateMessage msg = new AudioStateMessage();
msg.obj = true;
msg.what = 5;
this.sendMessage(msg);
}
void sendMessage(AudioStateMessage message) {
this.mCurAudioState.handleMessage(message);
}
void sendEmptyMessage(int event) {
AudioStateMessage message = AudioStateMessage.obtain();
message.what = event;
this.mCurAudioState.handleMessage(message);
}
private void startRec() {
Log.d("LQR_AudioRecordManager", "startRec");
try {
this.muteAudioFocus(this.mAudioManager, true);
this.mAudioManager.setMode(0);
this.mMediaRecorder = new MediaRecorder();
try {
int bps = 7950;
this.mMediaRecorder.setAudioSamplingRate(8000);
this.mMediaRecorder.setAudioEncodingBitRate(bps);
} catch (Resources.NotFoundException var2) {
var2.printStackTrace();
}
this.mMediaRecorder.setAudioChannels(1);
this.mMediaRecorder.setAudioSource(1);
this.mMediaRecorder.setOutputFormat(3);
this.mMediaRecorder.setAudioEncoder(1);
this.mAudioPath = Uri.fromFile(new File(this.SAVE_PATH, System.currentTimeMillis() + "temp.voice"));
this.mMediaRecorder.setOutputFile(this.mAudioPath.getPath());
this.mMediaRecorder.prepare();
this.mMediaRecorder.start();
Message e1 = Message.obtain();
e1.what = 7;
e1.obj = 10;
this.mHandler.sendMessageDelayed(e1, (long)(this.RECORD_INTERVAL * 1000 - 10000));
} catch (Exception var3) {
var3.printStackTrace();
}
}
private boolean checkAudioTimeLength() {
long delta = SystemClock.elapsedRealtime() - this.smStartRecTime;
return delta < 1000L;
}
private void stopRec() {
Log.d("LQR_AudioRecordManager", "stopRec");
try {
this.muteAudioFocus(this.mAudioManager, false);
if (this.mMediaRecorder != null) {
this.mMediaRecorder.stop();
this.mMediaRecorder.release();
this.mMediaRecorder = null;
}
} catch (Exception var2) {
var2.printStackTrace();
}
}
private void deleteAudioFile() {
Log.d("LQR_AudioRecordManager", "deleteAudioFile");
if (this.mAudioPath != null) {
File file = new File(this.mAudioPath.getPath());
if (file.exists()) {
file.delete();
}
}
}
private void finishRecord() {
Log.d("LQR_AudioRecordManager", "finishRecord path = " + this.mAudioPath);
if (this.mAudioRecordListener != null) {
int duration = (int)(SystemClock.elapsedRealtime() - this.smStartRecTime) / 1000;
this.mAudioRecordListener.onFinish(this.mAudioPath, duration);
}
}
private void audioDBChanged() {
if (this.mMediaRecorder != null) {
int db = this.mMediaRecorder.getMaxAmplitude() / 600;
if (this.mAudioRecordListener != null) {
this.mAudioRecordListener.onAudioDBChanged(db);
}
}
}
private void muteAudioFocus(AudioManager audioManager, boolean bMute) {
if (Build.VERSION.SDK_INT < 8) {
Log.d("LQR_AudioRecordManager", "muteAudioFocus Android 2.1 and below can not stop music");
} else if (bMute) {
audioManager.requestAudioFocus(this.mAfChangeListener, 3, 2);
} else {
audioManager.abandonAudioFocus(this.mAfChangeListener);
this.mAfChangeListener = null;
}
}
public IAudioRecordListener getAudioRecordListener() {
return this.mAudioRecordListener;
}
public void setAudioRecordListener(IAudioRecordListener audioRecordListener) {
this.mAudioRecordListener = audioRecordListener;
}
class IdleState extends IAudioState {
public IdleState() {
Log.d("LQR_AudioRecordManager", "IdleState");
}
void enter() {
super.enter();
if (AudioRecordManager.this.mHandler != null) {
AudioRecordManager.this.mHandler.removeMessages(7);
AudioRecordManager.this.mHandler.removeMessages(8);
AudioRecordManager.this.mHandler.removeMessages(2);
}
}
void handleMessage(AudioStateMessage msg) {
Log.d("LQR_AudioRecordManager", "IdleState handleMessage : " + msg.what);
switch(msg.what) {
case 1:
AudioRecordManager.this.initView();
AudioRecordManager.this.setRecordingView();
AudioRecordManager.this.startRec();
AudioRecordManager.this.smStartRecTime = SystemClock.elapsedRealtime();
AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.recordState;
AudioRecordManager.this.sendEmptyMessage(2);
default:
}
}
}
class RecordState extends IAudioState {
RecordState() {
}
void handleMessage(AudioStateMessage msg) {
Log.d("LQR_AudioRecordManager", this.getClass().getSimpleName() + " handleMessage : " + msg.what);
switch(msg.what) {
case 2:
AudioRecordManager.this.audioDBChanged();
AudioRecordManager.this.mHandler.sendEmptyMessageDelayed(2, 150L);
break;
case 3:
AudioRecordManager.this.setCancelView();
AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.cancelState;
case 4:
default:
break;
case 5:
final boolean checked = AudioRecordManager.this.checkAudioTimeLength();
boolean activityFinished = false;
if (msg.obj != null) {
activityFinished = (Boolean)msg.obj;
}
if (checked && !activityFinished) {
if (AudioRecordManager.this.mAudioRecordListener != null) {
AudioRecordManager.this.mAudioRecordListener.setAudioShortTipView();
}
AudioRecordManager.this.mHandler.removeMessages(2);
}
if (!activityFinished && AudioRecordManager.this.mHandler != null) {
AudioRecordManager.this.mHandler.postDelayed(new Runnable() {
public void run() {
AudioStateMessage message = AudioStateMessage.obtain();
message.what = 9;
message.obj = !checked;
AudioRecordManager.this.sendMessage(message);
}
}, 500L);
AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.sendingState;
} else {
AudioRecordManager.this.stopRec();
if (!checked && activityFinished) {
AudioRecordManager.this.finishRecord();
}
AudioRecordManager.this.destroyView();
AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
}
break;
case 6:
AudioRecordManager.this.stopRec();
AudioRecordManager.this.destroyView();
AudioRecordManager.this.deleteAudioFile();
AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
AudioRecordManager.this.idleState.enter();
break;
case 7:
int counter = (Integer)msg.obj;
AudioRecordManager.this.setTimeoutView(counter);
AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.timerState;
if (counter > 0) {
Message message = Message.obtain();
message.what = 8;
message.obj = counter - 1;
AudioRecordManager.this.mHandler.sendMessageDelayed(message, 1000L);
} else {
AudioRecordManager.this.mHandler.postDelayed(new Runnable() {
public void run() {
AudioRecordManager.this.stopRec();
AudioRecordManager.this.finishRecord();
AudioRecordManager.this.destroyView();
}
}, 500L);
AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
}
}
}
}
class SendingState extends IAudioState {
SendingState() {
}
void handleMessage(AudioStateMessage message) {
Log.d("LQR_AudioRecordManager", "SendingState handleMessage " + message.what);
switch(message.what) {
case 9:
AudioRecordManager.this.stopRec();
if ((Boolean)message.obj) {
AudioRecordManager.this.finishRecord();
}
AudioRecordManager.this.destroyView();
AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
default:
}
}
}
class CancelState extends IAudioState {
CancelState() {
}
void handleMessage(AudioStateMessage msg) {
Log.d("LQR_AudioRecordManager", this.getClass().getSimpleName() + " handleMessage : " + msg.what);
switch(msg.what) {
case 1:
case 2:
case 3:
default:
break;
case 4:
AudioRecordManager.this.setRecordingView();
AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.recordState;
AudioRecordManager.this.sendEmptyMessage(2);
break;
case 5:
case 6:
AudioRecordManager.this.stopRec();
AudioRecordManager.this.destroyView();
AudioRecordManager.this.deleteAudioFile();
AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
AudioRecordManager.this.idleState.enter();
break;
case 7:
int counter = (Integer)msg.obj;
if (counter > 0) {
Message message = Message.obtain();
message.what = 8;
message.obj = counter - 1;
AudioRecordManager.this.mHandler.sendMessageDelayed(message, 1000L);
} else {
AudioRecordManager.this.mHandler.postDelayed(new Runnable() {
public void run() {
AudioRecordManager.this.stopRec();
AudioRecordManager.this.finishRecord();
AudioRecordManager.this.destroyView();
}
}, 500L);
AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
AudioRecordManager.this.idleState.enter();
}
}
}
}
class TimerState extends IAudioState {
TimerState() {
}
void handleMessage(AudioStateMessage msg) {
Log.d("LQR_AudioRecordManager", this.getClass().getSimpleName() + " handleMessage : " + msg.what);
switch(msg.what) {
case 3:
AudioRecordManager.this.setCancelView();
AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.cancelState;
case 4:
default:
break;
case 5:
AudioRecordManager.this.mHandler.postDelayed(new Runnable() {
public void run() {
AudioRecordManager.this.stopRec();
AudioRecordManager.this.finishRecord();
AudioRecordManager.this.destroyView();
}
}, 500L);
AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
AudioRecordManager.this.idleState.enter();
break;
case 6:
AudioRecordManager.this.stopRec();
AudioRecordManager.this.destroyView();
AudioRecordManager.this.deleteAudioFile();
AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
AudioRecordManager.this.idleState.enter();
break;
case 7:
int counter = (Integer)msg.obj;
if (counter > 0) {
Message message = Message.obtain();
message.what = 8;
message.obj = counter - 1;
AudioRecordManager.this.mHandler.sendMessageDelayed(message, 1000L);
AudioRecordManager.this.setTimeoutView(counter);
} else {
AudioRecordManager.this.mHandler.postDelayed(new Runnable() {
public void run() {
AudioRecordManager.this.stopRec();
AudioRecordManager.this.finishRecord();
AudioRecordManager.this.destroyView();
}
}, 500L);
AudioRecordManager.this.mCurAudioState = AudioRecordManager.this.idleState;
}
}
}
}
}
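The state machine above is driven by raw integer message codes. As a reading aid only, the following constants summarize how those codes are used by the handlers; the names are inferred from the code and are not part of the original class:
// Reading aid only: names inferred from AudioRecordManager's handlers, not defined in the original source.
final class AudioRecordEvents {
    static final int START_RECORD = 1;         // IdleState: show tips, start the MediaRecorder, enter RecordState
    static final int SAMPLE_DB = 2;            // RecordState: poll getMaxAmplitude() every 150 ms for the volume tip
    static final int WILL_CANCEL = 3;          // show the cancel tip and enter CancelState
    static final int CONTINUE_RECORD = 4;      // leave CancelState and resume the recording tip
    static final int STOP_RECORD = 5;          // finger released: stop, then send or discard
    static final int ABORT_RECORD = 6;         // incoming call / audio focus loss: stop and delete the file
    static final int TIMEOUT_TICK = 7;         // countdown tip shortly before RECORD_INTERVAL is reached
    static final int TIMEOUT_TICK_HANDLER = 8; // Handler-side countdown tick, re-dispatched as 7
    static final int SEND_AUDIO = 9;           // SendingState: finish recording and deliver the file

    private AudioRecordEvents() {
    }
}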
AudioPlayManager
package com.badao.audiodemo.audioHelper;
import android.annotation.TargetApi;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.media.AudioManager;
import android.media.MediaPlayer;
import android.media.AudioManager.OnAudioFocusChangeListener;
import android.media.MediaPlayer.OnCompletionListener;
import android.media.MediaPlayer.OnErrorListener;
import android.media.MediaPlayer.OnPreparedListener;
import android.media.MediaPlayer.OnSeekCompleteListener;
import android.net.Uri;
import android.os.PowerManager;
import android.os.Build.VERSION;
import android.os.PowerManager.WakeLock;
import android.util.Log;
import java.io.IOException;
public class AudioPlayManager implements SensorEventListener {
private static final String TAG = "LQR_AudioPlayManager";
private MediaPlayer _mediaPlayer;
private IAudioPlayListener _playListener;
private Uri _playingUri;
private Sensor _sensor;
private SensorManager _sensorManager;
private AudioManager _audioManager;
private PowerManager _powerManager;
private WakeLock _wakeLock;
private OnAudioFocusChangeListener afChangeListener;
private Context context;
public AudioPlayManager() {
}
public static AudioPlayManager getInstance() {
return AudioPlayManager.SingletonHolder.sInstance;
}
@TargetApi(11)
public void onSensorChanged(SensorEvent event) {
float range = event.values[0];
if (this._sensor != null && this._mediaPlayer != null) {
if (this._mediaPlayer.isPlaying()) {
if ((double)range > 0.0D) {
if (this._audioManager.getMode() == 0) {
return;
}
this._audioManager.setMode(0);
this._audioManager.setSpeakerphoneOn(true);
final int positions = this._mediaPlayer.getCurrentPosition();
try {
this._mediaPlayer.reset();
this._mediaPlayer.setAudioStreamType(3);
this._mediaPlayer.setVolume(1.0F, 1.0F);
this._mediaPlayer.setDataSource(this.context, this._playingUri);
this._mediaPlayer.setOnPreparedListener(new OnPreparedListener() {
public void onPrepared(MediaPlayer mp) {
mp.seekTo(positions);
}
});
this._mediaPlayer.setOnSeekCompleteListener(new OnSeekCompleteListener() {
public void onSeekComplete(MediaPlayer mp) {
mp.start();
}
});
this._mediaPlayer.prepareAsync();
} catch (IOException var5) {
var5.printStackTrace();
}
this.setScreenOn();
} else {
this.setScreenOff();
if (VERSION.SDK_INT >= 11) {
if (this._audioManager.getMode() == 3) {
return;
}
this._audioManager.setMode(3);
} else {
if (this._audioManager.getMode() == 2) {
return;
}
this._audioManager.setMode(2);
}
this._audioManager.setSpeakerphoneOn(false);
this.replay();
}
} else if ((double)range > 0.0D) {
if (this._audioManager.getMode() == 0) {
return;
}
this._audioManager.setMode(0);
this._audioManager.setSpeakerphoneOn(true);
this.setScreenOn();
}
}
}
@TargetApi(21)
private void setScreenOff() {
if (this._wakeLock == null) {
if (VERSION.SDK_INT >= 21) {
this._wakeLock = this._powerManager.newWakeLock(32, "AudioPlayManager");
} else {
Log.e("LQR_AudioPlayManager", "Does not support on level " + VERSION.SDK_INT);
}
}
if (this._wakeLock != null) {
this._wakeLock.acquire();
}
}
private void setScreenOn() {
if (this._wakeLock != null) {
this._wakeLock.setReferenceCounted(false);
this._wakeLock.release();
this._wakeLock = null;
}
}
public void onAccuracyChanged(Sensor sensor, int accuracy) {
}
private void replay() {
try {
this._mediaPlayer.reset();
this._mediaPlayer.setAudioStreamType(0);
this._mediaPlayer.setVolume(1.0F, 1.0F);
this._mediaPlayer.setDataSource(this.context, this._playingUri);
this._mediaPlayer.setOnPreparedListener(new OnPreparedListener() {
public void onPrepared(MediaPlayer mp) {
mp.start();
}
});
this._mediaPlayer.prepareAsync();
} catch (IOException var2) {
var2.printStackTrace();
}
}
public void startPlay(Context context, Uri audioUri, IAudioPlayListener playListener) {
if (context != null && audioUri != null) {
this.context = context;
if (this._playListener != null && this._playingUri != null) {
this._playListener.onStop(this._playingUri);
}
this.resetMediaPlayer();
this.afChangeListener = new OnAudioFocusChangeListener() {
public void onAudioFocusChange(int focusChange) {
Log.d("LQR_AudioPlayManager", "OnAudioFocusChangeListener " + focusChange);
if (AudioPlayManager.this._audioManager != null && focusChange == -1) {
AudioPlayManager.this._audioManager.abandonAudioFocus(AudioPlayManager.this.afChangeListener);
AudioPlayManager.this.afChangeListener = null;
AudioPlayManager.this.resetMediaPlayer();
}
}
};
try {
this._powerManager = (PowerManager)context.getSystemService("power");
this._audioManager = (AudioManager)context.getSystemService("audio");
if (!this._audioManager.isWiredHeadsetOn()) {
this._sensorManager = (SensorManager)context.getSystemService("sensor");
this._sensor = this._sensorManager.getDefaultSensor(8);
this._sensorManager.registerListener(this, this._sensor, 3);
}
this.muteAudioFocus(this._audioManager, true);
this._playListener = playListener;
this._playingUri = audioUri;
this._mediaPlayer = new MediaPlayer();
this._mediaPlayer.setOnCompletionListener(new OnCompletionListener() {
public void onCompletion(MediaPlayer mp) {
if (AudioPlayManager.this._playListener != null) {
AudioPlayManager.this._playListener.onComplete(AudioPlayManager.this._playingUri);
AudioPlayManager.this._playListener = null;
AudioPlayManager.this.context = null;
}
AudioPlayManager.this.reset();
}
});
this._mediaPlayer.setOnErrorListener(new OnErrorListener() {
public boolean onError(MediaPlayer mp, int what, int extra) {
AudioPlayManager.this.reset();
return true;
}
});
this._mediaPlayer.setDataSource(context, audioUri);
this._mediaPlayer.setAudioStreamType(3);
this._mediaPlayer.prepare();
this._mediaPlayer.start();
if (this._playListener != null) {
this._playListener.onStart(this._playingUri);
}
} catch (Exception var5) {
var5.printStackTrace();
if (this._playListener != null) {
this._playListener.onStop(audioUri);
this._playListener = null;
}
this.reset();
}
} else {
Log.e("LQR_AudioPlayManager", "startPlay context or audioUri is null.");
}
}
public void setPlayListener(IAudioPlayListener listener) {
this._playListener = listener;
}
public void stopPlay() {
if (this._playListener != null && this._playingUri != null) {
this._playListener.onStop(this._playingUri);
}
this.reset();
}
private void reset() {
this.resetMediaPlayer();
this.resetAudioPlayManager();
}
private void resetAudioPlayManager() {
if (this._audioManager != null) {
this.muteAudioFocus(this._audioManager, false);
}
if (this._sensorManager != null) {
this._sensorManager.unregisterListener(this);
}
this._sensorManager = null;
this._sensor = null;
this._powerManager = null;
this._audioManager = null;
this._wakeLock = null;
this._playListener = null;
this._playingUri = null;
}
private void resetMediaPlayer() {
if (this._mediaPlayer != null) {
try {
this._mediaPlayer.stop();
this._mediaPlayer.reset();
this._mediaPlayer.release();
this._mediaPlayer = null;
} catch (IllegalStateException var2) {
}
}
}
public Uri getPlayingUri() {
return this._playingUri;
}
@TargetApi(8)
private void muteAudioFocus(AudioManager audioManager, boolean bMute) {
if (VERSION.SDK_INT < 8) {
Log.d("LQR_AudioPlayManager", "muteAudioFocus Android 2.1 and below can not stop music");
} else if (bMute) {
audioManager.requestAudioFocus(this.afChangeListener, 3, 2);
} else {
audioManager.abandonAudioFocus(this.afChangeListener);
this.afChangeListener = null;
}
}
static class SingletonHolder {
static AudioPlayManager sInstance = new AudioPlayManager();
SingletonHolder() {
}
}
}
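AudioPlayManager keeps hold of the MediaPlayer, the proximity-sensor listener and a wake lock until playback completes or stopPlay() is called, so playback should be stopped when the screen that started it goes away. A minimal sketch, assuming playback was started from MainActivity or its adapter:
@Override
protected void onDestroy() {
    // stopPlay() notifies the current IAudioPlayListener via onStop() and resets the MediaPlayer.
    AudioPlayManager.getInstance().stopPlay();
    super.onDestroy();
}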
AudioStateMessage
package com.badao.audiodemo.audioHelper;
public class AudioStateMessage {
public int what;
public Object obj;
public AudioStateMessage() {
}
public static AudioStateMessage obtain() {
return new AudioStateMessage();
}
}
IAudioPlayListener
package com.badao.audiodemo.audioHelper;
import android.net.Uri;
public interface IAudioPlayListener {
void onStart(Uri var1);
void onStop(Uri var1);
void onComplete(Uri var1);
}
IAudioRecordListener
package com.badao.audiodemo.audioHelper;
import android.net.Uri;
public interface IAudioRecordListener {
void initTipView();
void setTimeoutTipView(int var1);
void setRecordingTipView();
void setAudioShortTipView();
void setCancelTipView();
void destroyTipView();
void onStartRecord();
void onFinish(Uri var1, int var2);
void onAudioDBChanged(int var1);
}
IAudioState
package com.badao.audiodemo.audioHelper;
public abstract class IAudioState {
public IAudioState() {
}
void enter() {
}
abstract void handleMessage(AudioStateMessage var1);
}
Implementing recording
Open MainActivity and declare an audioRecordManager field:
private AudioRecordManager audioRecordManager;
Then initialize it in onCreate:
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    audioRecordManager = AudioRecordManager.getInstance(MainActivity.this);
    File file = new File(MainActivity.this.getExternalFilesDir("voice").getAbsolutePath());
    if (!file.exists()) {
        file.mkdirs();
    }
    // Set the directory where recordings are saved
    audioRecordManager.setAudioSavePath(file.getAbsolutePath());
    // Set the record listener
    audioRecordManager.setAudioRecordListener(recordListener);
After initialization, the save path for the recording files and a record listener are set on the manager, so a listener has to be declared and initialized first:
private final IAudioRecordListener recordListener = new IAudioRecordListener() {
@Override
public void initTipView() {
}
@Override
public void setTimeoutTipView(int i) {
}
@Override
public void setRecordingTipView() {
}
@Override
public void setAudioShortTipView() {
}
@Override
public void setCancelTipView() {
}
@Override
public void destroyTipView() {
}
@Override
public void onStartRecord() {
}
@Override
public void onFinish(Uri uri, int i) {
File file = new File(uri.getPath());
// Get the duration of the audio file
int voiceDuration = 0;
try {
meidaPlayer.setDataSource(uri.getPath());
meidaPlayer.prepare();
int time = meidaPlayer.getDuration(); // duration in milliseconds
voiceDuration = time / 1000;
if (voiceDuration < 1) {
voiceDuration = 1;
}
} catch (IOException e) {
e.printStackTrace();
} finally {
meidaPlayer.reset();
}
ChatBean.ChatItem chatItem = new ChatBean.ChatItem();
chatItem.setId((int) System.currentTimeMillis());
chatItem.setSendTime(new Date().toString());
chatItem.setContent(file.getAbsolutePath());
// Store the duration of the voice file
chatItem.setVoiceDuration(voiceDuration);
chatItemList.add(chatItem);
chatAdapter.setmEntityList(chatItemList);
}
@Override
public void onAudioDBChanged(int i) {
}
};
The onFinish method overridden in this listener is the callback invoked when a recording ends; its Uri parameter is the path of the recorded file.
In this method the recording's duration is read with a MediaPlayer (the meidaPlayer field), a few other fields are filled in, and the item is handed to the RecyclerView through its adapter.
Populating a RecyclerView through an Adapter is the standard Android Adapter pattern and is not repeated in detail here; the relevant ChatAdapter pieces are sketched below.
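ChatBean.ChatItem and ChatAdapter (together with the chatItemList, chatAdapter and meidaPlayer fields used in onFinish) are declared elsewhere in the project and are not shown in this article. A minimal skeleton that matches the calls used above and in the playback section could look like the following; only the members the article actually uses are sketched, and the item layout and view IDs (R.layout.item_chat, R.id.tv_content) are assumptions:
// Hedged skeleton of the data class: only the members referenced in this article.
public class ChatBean {
    public static class ChatItem {
        private int id;
        private String sendTime;
        private String content;    // absolute path of the recorded audio file
        private int voiceDuration; // duration in seconds

        public void setId(int id) { this.id = id; }
        public void setSendTime(String sendTime) { this.sendTime = sendTime; }
        public void setContent(String content) { this.content = content; }
        public String getContent() { return content; }
        public void setVoiceDuration(int voiceDuration) { this.voiceDuration = voiceDuration; }
        public int getVoiceDuration() { return voiceDuration; }
    }
}
// Hedged skeleton of the adapter; the real project's ChatAdapter may contain more.
public class ChatAdapter extends RecyclerView.Adapter<ChatAdapter.ChatViewHolder> {

    private List<ChatBean.ChatItem> mEntityList = new ArrayList<>();

    public void setmEntityList(List<ChatBean.ChatItem> entityList) {
        this.mEntityList = entityList;
        notifyDataSetChanged(); // refresh the list after a new recording is added
    }

    @NonNull
    @Override
    public ChatViewHolder onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
        // R.layout.item_chat and R.id.tv_content are assumed names for the item layout.
        View itemView = LayoutInflater.from(parent.getContext())
                .inflate(R.layout.item_chat, parent, false);
        return new ChatViewHolder(itemView);
    }

    @Override
    public void onBindViewHolder(@NonNull ChatViewHolder holder, int position) {
        // Shown in the playback section below.
    }

    @Override
    public int getItemCount() {
        return mEntityList.size();
    }

    static class ChatViewHolder extends RecyclerView.ViewHolder {
        TextView mText;

        ChatViewHolder(@NonNull View itemView) {
            super(itemView);
            mText = itemView.findViewById(R.id.tv_content);
        }
    }
}
In onCreate the adapter also has to be attached to the RecyclerView with a LayoutManager, and meidaPlayer initialized as a plain MediaPlayer instance.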
Then, in the ImageView touch handler in MainActivity, start and stop the manager:
audioImageView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View view, MotionEvent motionEvent) {
        if (motionEvent.getAction() == MotionEvent.ACTION_DOWN) {
            Toast.makeText(MainActivity.this, "Recording started", Toast.LENGTH_SHORT).show();
            audioRecordManager.startRecord();
            return true;
        } else if (motionEvent.getAction() == MotionEvent.ACTION_UP) {
            Toast.makeText(MainActivity.this, "Recording stopped", Toast.LENGTH_SHORT).show();
            audioRecordManager.stopRecord();
            return true;
        }
        return false;
    }
});
This starts recording on press and stops it on release.
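AudioRecordManager also exposes willCancelRecord(), continueRecord() and destroyRecord(), which the simple press/release handler above does not use. As an optional extension (not part of the original article), the touch handler can support slide-up-to-cancel and react to ACTION_CANCEL roughly like this:
// Sketch of an extended touch handler; the getY() threshold is an illustrative heuristic.
audioImageView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View view, MotionEvent motionEvent) {
        switch (motionEvent.getAction()) {
            case MotionEvent.ACTION_DOWN:
                audioRecordManager.startRecord();
                return true;
            case MotionEvent.ACTION_MOVE:
                if (motionEvent.getY() < 0) {
                    audioRecordManager.willCancelRecord(); // finger left the button: show the cancel tip
                } else {
                    audioRecordManager.continueRecord();   // finger is back: keep recording
                }
                return true;
            case MotionEvent.ACTION_UP:
                audioRecordManager.stopRecord();           // sends the file unless the cancel state is active
                return true;
            case MotionEvent.ACTION_CANCEL:
                // The gesture was taken over (e.g. by a scrolling parent): enter the cancel
                // state first so that stopRecord() deletes the file instead of sending it.
                audioRecordManager.willCancelRecord();
                audioRecordManager.stopRecord();
                return true;
            default:
                return false;
        }
    }
});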
Playing back recordings
Following the same adapter-based approach used above to hand the finished recording to the RecyclerView, override onBindViewHolder in ChatAdapter:
@Override
public void onBindViewHolder(@NonNull ChatViewHolder holder, int position) {
    holder.mText.setText(mEntityList.get(position).getContent().toString());
    // Set a click listener on each item
    holder.itemView.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View view) {
            File file = new File(mEntityList.get(position).getContent());
            // Play the audio file; audioPlayManager is AudioPlayManager.getInstance() held by the
            // adapter, and App.context is the application context exposed by a custom Application
            // class (both declared elsewhere in the project).
            audioPlayManager.startPlay(App.context, Uri.fromFile(file), new IAudioPlayListener() {
                @Override
                public void onStart(Uri uri1) {
                }

                @Override
                public void onStop(Uri uri1) {
                }

                @Override
                public void onComplete(Uri uri1) {
                }
            });
        }
    });
}
This attaches a click listener to each item and plays the corresponding audio file.
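The onStart, onStop and onComplete callbacks are left empty above; they are the natural place to reflect the playback state in the list, for example by re-binding the row that is currently playing. A sketch of one way to do this inside ChatAdapter (the playingPosition field and playStateListener are introduced only for this example):
// Track which row is currently playing and re-bind it when playback starts or ends.
private int playingPosition = RecyclerView.NO_POSITION;

// In onClick, before calling startPlay(...): playingPosition = holder.getAdapterPosition();
private final IAudioPlayListener playStateListener = new IAudioPlayListener() {
    @Override
    public void onStart(Uri uri) {
        rebindPlayingRow(); // e.g. start a "playing" animation on that row
    }

    @Override
    public void onStop(Uri uri) {
        rebindPlayingRow();
        playingPosition = RecyclerView.NO_POSITION;
    }

    @Override
    public void onComplete(Uri uri) {
        rebindPlayingRow();
        playingPosition = RecyclerView.NO_POSITION;
    }
};

private void rebindPlayingRow() {
    if (playingPosition != RecyclerView.NO_POSITION) {
        notifyItemChanged(playingPosition);
    }
}
Passing playStateListener as the third argument of startPlay() instead of the empty anonymous listener keeps all playback UI updates in one place.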