Filling Audio Buffer to Generate Waves in the PSLab Android App

The PSLab Android App works as an oscilloscope and a wave generator using the audio jack of the Android device. The implementation of the oscilloscope using the in-built mic is discussed in the blog post “Using the Audio Jack to make an Oscilloscope in the PSLab Android App”, and the wave generator is discussed in “Implement Wave Generation Functionality in the PSLab Android App”. This post continues the discussion of the wave generation functionality. Here, the subject is how to fill the audio buffer so that the resulting wave is a sine wave, a square wave or a sawtooth wave. The resulting audio buffer is played using the AudioTrack API of Android to generate the corresponding wave. The waves we are trying to generate are periodic waves.

Periodic Wave: A wave whose displacement has a periodic variation with respect to time or distance, or both.

Thus, the problem reduces to generating a pulse which will constitute a single time period of the wave. Suppose we want to generate a sine wave; if we generate a continuous stream of pulses as illustrated in the image below, we would get a continuous sine wave. This is the main concept that we shall try to implement using code.

Initialise AudioTrack Object

An AudioTrack object is initialised using the following parameters:

  • STREAM TYPE: Type of stream, like STREAM_SYSTEM, STREAM_MUSIC, STREAM_RING, etc. For wave generation we use STREAM_MUSIC. Every stream has its own maximum and minimum volume level.
  • SAMPLING RATE: The rate at which the source samples the audio signal.
  • BUFFER SIZE IN BYTES: Total size, in bytes, of the internal buffer from which the audio data is read for playback.
  • MODES: There are two modes:
    • MODE_STATIC: Audio data is transferred from Java to the native layer only once, before the audio starts playing.
    • MODE_STREAM: Audio data is streamed from Java to the native layer as the audio is being played.

getMinBufferSize() returns the estimated minimum buffer size required for an AudioTrack object to be created in the MODE_STREAM mode.

private static final int SAMPLING_RATE = 44100;   // 44.1 kHz is supported on practically all devices
private AudioTrack audioTrack;
private int minTrackBufferSize;

// minimum buffer size required to create an AudioTrack in MODE_STREAM
minTrackBufferSize = AudioTrack.getMinBufferSize(SAMPLING_RATE,
       AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

audioTrack = new AudioTrack(
       AudioManager.STREAM_MUSIC,        // stream type
       SAMPLING_RATE,                    // sampling rate
       AudioFormat.CHANNEL_OUT_MONO,     // channel configuration
       AudioFormat.ENCODING_PCM_16BIT,   // audio encoding
       minTrackBufferSize,               // buffer size in bytes
       AudioTrack.MODE_STREAM);          // mode

Fill Audio Buffer to Generate Sine Wave

Depending on the values in the audio buffer, the wave is generated by the AudioTrack object. Therefore, to generate a specific kind of wave, we need to fill the audio buffer with some specific values. The values are governed by the wave equation of the signal that we want to generate.

public short[] createBuffer(int frequency) {
   short[] buffer = new short[minTrackBufferSize];
   double f = frequency;          // instantaneous frequency, eased towards the target
   double q = 0;                  // phase accumulator, kept in the range [-π, π)
   double level = 16384;          // amplitude (half of the 16-bit full scale)
   final double K = 2.0 * Math.PI / SAMPLING_RATE;   // phase increment per sample per Hz

   for (int i = 0; i < minTrackBufferSize; i++) {
         f += (frequency - f) / 4096.0;   // smooth frequency changes to avoid clicks
         // advance the phase and wrap it back by 2π once it crosses π
         q += (q < Math.PI) ? f * K : (f * K) - (2.0 * Math.PI);
         buffer[i] = (short) Math.round(level * Math.sin(q));
   }
   return buffer;
}

Fill Audio Buffer to Generate Square Wave

To generate a square wave, let’s assume the time period to be t units. So, we need the amplitude to be equal to A for t/2 units and -A for the next t/2 units. Repeating this pulse continuously, we will get a square wave.

// inside the same loop as the sine example, replace the Math.sin() line with:
buffer[i] = (short) ((q > 0.0) ? level : -level);
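
A minimal sketch of what a dedicated square-wave buffer method could look like is shown below; it assumes the same SAMPLING_RATE, minTrackBufferSize and amplitude level as the sine-wave example, and the method name createSquareBuffer() is only illustrative:

public short[] createSquareBuffer(int frequency) {
   short[] buffer = new short[minTrackBufferSize];
   double f = frequency;
   double q = 0;
   double level = 16384;
   final double K = 2.0 * Math.PI / SAMPLING_RATE;

   for (int i = 0; i < minTrackBufferSize; i++) {
       f += (frequency - f) / 4096.0;
       q += (q < Math.PI) ? f * K : (f * K) - (2.0 * Math.PI);
       // +level for the half period where q > 0, -level for the other half
       buffer[i] = (short) ((q > 0.0) ? level : -level);
   }
   return buffer;
}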

Fill Audio Buffer to Generate Sawtooth Wave

A ramp signal increases linearly with time. A single ramp pulse is illustrated in the image below:

We need repeated ramp pulses to generate a continuous sawtooth wave.

// inside the same loop, q / Math.PI sweeps linearly from -1 to 1, giving a ramp; scale it by the amplitude
buffer[i] = (short) Math.round(level * (q / Math.PI));
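
Since all three waves share the same phase accumulator q, one way to avoid duplicating the loop is a single buffer-filling method with a wave-type parameter. The WaveType enum and the switch below are only an illustration of that idea, not the actual PSLab implementation:

enum WaveType { SINE, SQUARE, SAWTOOTH }

// inside the loop of a createBuffer(int frequency, WaveType type) variant:
switch (type) {
    case SINE:
        buffer[i] = (short) Math.round(level * Math.sin(q));
        break;
    case SQUARE:
        buffer[i] = (short) ((q > 0.0) ? level : -level);
        break;
    case SAWTOOTH:
        buffer[i] = (short) Math.round(level * (q / Math.PI));
        break;
}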

Finally, when the audio buffer is generated, write it to the audio sink for playback using the write() method exposed by the AudioTrack object.

audioTrack.write(buffer, 0, buffer.length);
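
AudioTrack outputs sound only after playback has been started, and in MODE_STREAM the buffer has to be written repeatedly to keep the tone going. A minimal playback loop, assuming a boolean isPlaying flag controlled from the UI, might look like this:

short[] buffer = createBuffer(1000);   // e.g. a 1 kHz tone
audioTrack.play();                     // start streaming playback
while (isPlaying) {
    // keep feeding the periodic buffer to the stream
    audioTrack.write(buffer, 0, buffer.length);
}
audioTrack.stop();
audioTrack.release();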

Implement Wave Generation Functionality in The PSLab Android App

The PSLab Android App works as an oscilloscope using the audio jack of the Android device. The implementation of the scope using the in-built mic is discussed in the post Using the Audio Jack to make an Oscilloscope in the PSLab Android App. Another application that can be implemented by hacking the audio jack is wave generation. We can generate different types of signals on the wires connected to the audio jack using the Android APIs that control the audio hardware. In this post, I will discuss how we can generate waves using these APIs.

Configuration of Audio Jack for Wave Generation

Simply cut open the wire of a cheap pair of earphones to gain control of its terminals and attach alligator pins by soldering, or by any other hack (jugaad) that you can think of. After you are done tinkering with the earphone jack, it should look something like the image shown below.

Source: edn.com

If your earphones have a mic, there will be an extra wire for the mic input. In any general pair of earphones the wire configuration is almost the same as shown in the image below.

Source: flickr

Android APIs for Controlling Audio Hardware

AudioRecord and AudioTrack are the two classes in Android that manage recording and playback respectively. For the wave generation application we only need the AudioTrack class.

Creating an AudioTrack object: We need the following parameters to initialise an AudioTrack object.

STREAM TYPE: Type of stream, like STREAM_SYSTEM, STREAM_MUSIC, STREAM_RING, etc. For wave generation we use STREAM_MUSIC. Every stream has its own maximum and minimum volume level.

SAMPLING RATE: The rate at which the source samples the audio signal.

BUFFER SIZE IN BYTES: Total size, in bytes, of the internal buffer from which the audio data is read for playback.

MODES: There are two modes:

  • MODE_STATIC: Audio data is transferred from Java to the native layer only once, before the audio starts playing.
  • MODE_STREAM: Audio data is streamed from Java to the native layer as the audio is being played.

getMinBufferSize() returns the estimated minimum buffer size required for an AudioTrack object to be created in the MODE_STREAM mode.

private int minTrackBufferSize;
private static final int SAMPLING_RATE = 44100;
private AudioTrack audioTrack;

minTrackBufferSize = AudioTrack.getMinBufferSize(SAMPLING_RATE,
       AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

audioTrack = new AudioTrack(
       AudioManager.STREAM_MUSIC,
       SAMPLING_RATE,
       AudioFormat.CHANNEL_OUT_MONO,
       AudioFormat.ENCODING_PCM_16BIT,
       minTrackBufferSize,
       AudioTrack.MODE_STREAM);

The createBuffer() function creates the audio buffer that is played using the AudioTrack object, i.e., the AudioTrack object writes this buffer to the playback stream. The function below fills the buffer with random values, which produces a random signal. If we want to generate a specific wave such as a square wave, sine wave or triangular wave, we have to fill the buffer accordingly.

public short[] createBuffer(int frequency) {
   // generating a random buffer for now
   Random random = new Random();
   short[] buffer = new short[minTrackBufferSize];
   for (int i = 0; i < minTrackBufferSize; i++) {
       // random value spanning the full 16-bit range [-32768, 32767]
       buffer[i] = (short) (random.nextInt(65536) - 32768);
   }
   return buffer;
}

We create a write() method and pass the audio buffer created in the above step as an argument to it. This method writes the audio buffer to the audio stream for playback.

public void write(short[] buffer) {
   /* write buffer to audioTrack */
   audioTrack.write(buffer, 0, buffer.length);
}
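
Putting the pieces together, a minimal usage sketch could look like the following; the 1000 Hz argument and the order of the calls are only illustrative of the intended flow:

audioTrack.play();                     // start streaming playback
short[] buffer = createBuffer(1000);   // fill a buffer (random values for now)
write(buffer);                         // push the buffer to the audio stream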

The amplitude of the signal can be controlled by changing the volume level of the stream on which the buffer is being played. As we are playing the audio on the music stream, STREAM_MUSIC is passed as a parameter to the setStreamVolume() method.

value: The amplitude level of the stream. Every stream has its own range of amplitude levels; the getStreamMaxVolume(STREAM_TYPE) method returns the maximum valid amplitude level of a stream.
flag: This StackOverflow post explains all the flags of the AudioManager class.

AudioManager audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
audioManager.setStreamVolume(AudioManager.STREAM_MUSIC, value, flag);
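
For example, to set the music stream to half of its maximum level (the one-half fraction and the flag value 0, meaning no extra UI feedback, are arbitrary choices for illustration):

int maxVolume = audioManager.getStreamMaxVolume(AudioManager.STREAM_MUSIC);
audioManager.setStreamVolume(AudioManager.STREAM_MUSIC, maxVolume / 2, 0);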

Roadmap

We are working on implementing methods to fill the audio buffer with specific values so that waves such as sine, square and sawtooth waves can be generated during playback of the buffer by the AudioTrack object.

Using the Audio Jack to make an Oscilloscope in the PSLab Android App

The PSLab Android App allows users to access functionality provided by the PSLab hardware device, but in the interest of appealing to a larger audience that may not have immediate access to the device, we’re working on implementing some additional functionalities to perform experiments using only the hardware and sensors available in most Android phones. The mentors suggested that the audio jack (microphone input) of phones can be hacked to make it function as an oscilloscope. Similarly, the audio output can also be used as a 2-channel arbitrary waveform generator. So I did a little research and found some articles describing how it can be done. In this post, I will dive a bit into the following aspects:

  • Audio jack specifications for Android devices
  • Android APIs that provide access to the audio hardware of the device
  • Integrating both to achieve scope functionality

Audio Jack specifications for Android devices

In a general audio jack interface, the CTIA configuration (LRGM: Left, Right, Ground, Mic) is present, as shown in the image below. Some interfaces have the OMTP configuration (LRMG: Left, Right, Mic, Ground), in which the common and mic terminals are interchanged. In the image, Common refers to ground.

Source: howtogeek

If we simply cut open the wire of a cheap pair of earphones (stolen from an airplane? 😉), we will gain access to all the terminals (Left, Right, Common, Mic Input) illustrated in the image below.

Source: flickr

Android APIs that provide access to the audio hardware of the device

AudioRecord and AudioTrack are the two classes in Android that manage recording and playback respectively. We require only AudioRecord to implement the scope functionality. We first create an object of the AudioRecord class and then use that object to read the audio buffer as and when required.

Creating an AudioRecord object: We need the following parameters to initialise an AudioRecord object.

SAMPLING_RATE: Almost all mobile devices support a sampling rate of 44100 Hz, i.e. 44100 audio samples are taken per second.

RECORDER_AUDIO_ENCODING: Audio encoding describes the bit representation of the audio data. Here we use PCM_16BIT encoding, which means the stream of bits generated by PCM is segregated into sets of 16 bits, one sample per set.

getMinBufferSize() returns the minimum buffer size, in bytes, required to create an AudioRecord object successfully.

private static final int SAMPLING_RATE = 44100;
private static final int RECORDING_CHANNEL = AudioFormat.CHANNEL_IN_MONO;
private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT;
private AudioRecord audioRecord = null;
private int minRecorderBufferSize;
minRecorderBufferSize = AudioRecord.getMinBufferSize(SAMPLING_RATE, RECORDING_CHANNEL, RECORDER_AUDIO_ENCODING);
audioRecord = new AudioRecord(
       MediaRecorder.AudioSource.MIC,
       SAMPLING_RATE,
       RECORDING_CHANNEL,
       RECORDER_AUDIO_ENCODING,
       minRecorderBufferSize);

The audioRecord object can be used to read the audio buffer from the audio hardware using the read() method.

minRecorderBufferSize is in bytes and 2 bytes constitute a short in Java, so the short buffer needs to be half that number of elements.

short[] audioBuffer = new short[minRecorderBufferSize / 2];
audioRecord.startRecording();   // recording must be started before read() returns data
audioRecord.read(audioBuffer, 0, audioBuffer.length);

Now audioBuffer holds the audio data as signed 16-bit values. We need to process the buffer data and plot the processed data points on a chart to completely implement the scope functionality. I am still looking for the relation between the signed 16-bit values of the audio buffer and the actual mic bias voltage. According to the Android headset specs, the mic bias voltage is between 1.8 V and 2.9 V.
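
Until that relation is pinned down, a simple placeholder is to scale the raw samples to the range [-1, 1] before plotting; the snippet below is just this normalization, not a calibrated voltage conversion:

// normalize signed 16-bit samples to the range [-1, 1] for plotting
float[] points = new float[audioBuffer.length];
for (int i = 0; i < audioBuffer.length; i++) {
    points[i] = audioBuffer[i] / 32768.0f;
}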

Using AudioRecord class to create a scope in PSLab Android

In the PSLab Android App, there is already an oscilloscope made to capture and plot the data received from the PSLab device. To make a cheap oscilloscope, cut open the wire of a cheap headset, expose the terminals as illustrated in the image above and provide the input signal at the microphone input terminal.

Note: Don’t provide a voltage of more than 2 V at the mic input terminal; it can damage your Android device. To be sure, check the peak voltage of the signal you want to apply with an external voltmeter, and if it is greater than 2 V, first build a voltage divider to lower the voltage and then you are good to go.

To integrate plotting of the audio buffer, we simply need to create another thread that captures the audio data and updates the UI with the processed buffer data.

public class captureAudioBuffer extends AsyncTask<Void, Void, Void> {

    private AudioJack audioJack;
    private short[] buffer;

    public captureAudioBuffer(AudioJack audioJack) {
        this.audioJack = audioJack;
    }

    @Override
    protected Void doInBackground(Void... params) {
        buffer = audioJack.read();
        Log.v("AudioBuffer", Arrays.toString(buffer));
        audioJack.release();
        return null;
    }

    @Override
    protected void onPostExecute(Void aVoid) {
        super.onPostExecute(aVoid);
        // UPDATE UI ACCORDING TO READ BUFFER DATA
        Log.v("Execution Done", "Completed");
    }
}
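
To kick off a capture, the AsyncTask just needs to be executed with an AudioJack instance. The no-argument AudioJack() constructor below is an assumption made for illustration, based on how the class is used here:

// assuming AudioJack wraps the AudioRecord setup shown earlier
AudioJack audioJack = new AudioJack();
new captureAudioBuffer(audioJack).execute();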

For the complete code of the AudioJack class, please refer to the pslab-android-app repository.
