Object-Oriented OS/2 Audio Device Driver Samples


Reprint Courtesy of International Business Machines Corporation, © International Business Machines Corporation

Introduction

The object-oriented (OO) OS/2 Audio Device Driver Sample is provided to make it much easier for you to develop an OS/2 audio device driver, and it is intended to serve as a roadmap for that development.

Unlike the existing ProAudio Spectrum 16 sample, which supports Wave, Wavetable, and FM synthesis, this new OO audio sample supports all technologies with much less code. This new sample, which supports the Turtle Beach "Tropez Plus" sound card, is a far more adaptable sample.

Some of the highlights this sample includes are:

  • Full-duplex capability.
  • Real-time MIDI operation that runs at task time using context hooks.
  • Wave, Wavetable, and FM synthesis technologies are clearly encapsulated into distinct modules in the new sample. They can be mixed and matched as needed.
  • This sample is coded with the Watcom 10.6 compiler. Most of the existing samples (including the PAS16) are written for a version of the Microsoft "C" compiler which is no longer available in the marketplace.
  • This sample exploits the OS/2 Resource Manager for identification and allocation of I/O resources for ISA Plug and Play adapters.

This document is organized to provide you with:

  • A design overview of the OO sample
  • A roadmap for adapting the sample code to your hardware

This sample is oriented toward ISA bus, Plug and Play hardware. The target OS/2 level is OS/2 Warp Version 4. OS/2 Warp Version 3 is supported only through special-case handling of the I/O assignments, and is specific to the hardware in the sample. Performance tracing, if used, requires an Intel® Pentium® processor.

A number of topics of interest are not covered by this sample.

These topics and where you can find more information follow:

  • End-customer installation: An audio driver installation sample is provided in the existing OS/2 DDK in the SRC\DEV\MME\AUDINST directory.
  • Power management: Refer to the OS/2 Physical Device Driver Reference.
  • RTMIDI input interface: This topic is covered by the existing MPU-401 sample in the SRC\DEV\MME\MPU401 directory.

Object-Oriented Technology

Object-oriented (OO) tools were used to design and implement this sample. Included in this documentation is a drawing that depicts the overall design. You need some familiarity with standard OO model notation and OO terminology to understand the drawing and the discussion that follows.

An OO approach was used primarily for its:

  • Encapsulation (scoping and interface definitions)
  • Generalization (object derivations and inheritance)
  • Polymorphism (function behavior is dependent on object type)

The built-in memory management of the C++ programming environment also provides for more readable code.

This sample is built on a call/return model. It does not utilize message passing, which is often associated with an OO model. Also, the sample does not use C++ exceptions.

Watcom C/C++ for OS/2 Physical Device Drivers

There are a number of items to consider when using the Watcom C/C++ compiler to write OS/2 device drivers. First, it will be useful to refer to the Watcom C/C++ Programmer's Guide and its discussion on developing an OS/2 physical device driver. That documentation gives an outline for using Watcom C/C++ for writing OS/2 drivers in C. The additional notes here provide tips for using the extended features of the C++ language (as compared to the C language) in the OS/2 driver environment.

Subset of C++ Language Features Available in the OS/2 Driver Environment

Many C++ language features require runtime libraries that are supplied by the compiler vendor. These libraries do not work in the Ring 0 kernel environment and you must take care not to link with the Watcom libraries. The link flags used in the makefile of the sample illustrate how to do this.

The sample re-implements a small subset of these libraries to make some features available. In particular, the "new" and "delete" operators are implemented to resolve to calls to malloc() and free(). The sample also implements the dynamic heap management which malloc() and free() control.
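
As an illustration only, a minimal sketch of how operator new and operator delete might be routed onto the driver's heap, as the sample does (the malloc()/free() prototypes here stand in for the sample's own heap manager):

#include <stddef.h>                     // for size_t

extern "C" void *malloc(size_t size);   // the driver's own heap manager
extern "C" void  free(void *p);

void *operator new(size_t size)
{
    return malloc(size);                // no exceptions at Ring 0; callers must check for NULL
}

void operator delete(void *p)
{
    if (p != 0)
        free(p);
}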

Among the C++ language features that are not available are:

  • Globally-scoped objects with user-defined constructors. The construction of such an object would have to occur before the general initialization of the device driver, and this sample does not include support for that.
  • Virtual destructors.
  • Exceptions.
  • The "new []" and "delete []" operators. Consequently, arrays must be allocated from the heap manually, first by a call to malloc() and then by casting the address of the allocation into a pointer to the array, as in the sketch following this list.
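
Since new[] is unavailable, arrays come from malloc() with a cast, as in this hedged sketch (the WaveBufferDesc type and the AllocateDescriptors() helper are hypothetical, shown only to illustrate the pattern):

#include <stddef.h>

extern "C" void *malloc(size_t size);
extern "C" void  free(void *p);

struct WaveBufferDesc { void *pBuffer; unsigned long ulLength; };   // hypothetical type

void AllocateDescriptors()
{
    // new[] is unavailable, so allocate the array by hand and cast the raw storage.
    WaveBufferDesc *pDescs =
        (WaveBufferDesc *) malloc(4 * sizeof(WaveBufferDesc));      // room for 4 descriptors
    if (pDescs == 0)
        return;                         // heap exhausted
    /* ... use pDescs[0] through pDescs[3] ... */
    free(pDescs);                       // release with free(), not delete[]
}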

The Debugging Environment and the WAT2MAP Utility

If you are an experienced OS/2 driver developer, you probably have been using the OS/2 kernel debugger for your debug activities. Debug symbols are provided by running the MAPSYM utility against the output listing of the Microsoft linker. Watcom C++ must be linked with the Watcom linker, and unfortunately, the listing file generated by the Watcom linker is not compatible with the MAPSYM utility. Consequently, something must be done to obtain symbols.

There are two solutions to this problem. First, there is a new debugger available for the OS/2 Ring 0 environment. Named "ICAT", this debugger offers some nice features, including source code views of your remote debug. The ICAT debugger is compatible with the Watcom linker and full symbolic information is available.

If you prefer to use the OS/2 kernel debugger, a small REXX utility named WAT2MAP.CMD is provided. This utility converts the output of the Watcom linker into a format which MAPSYM understands. There are a number of limitations and rules on how the symbols are translated (the OS/2 kernel debug facility does not understand many identifiers that are legitimate parts of C++ symbolic names, such as the "::" scoping operator). Reference the file header in WAT2MAP.CMD for additional information.

Coding Conventions

In general, the sample uses a leading underscore on the names of member data and member functions that are private.

The .HPP and .CPP files are paired. The .HPP file defines the interface between the module and the rest of the driver, while the .CPP file contains the implementation of the object. For example, the ResourceManager (RM) object is defined in RM.HPP. Its implementation is in RM.CPP.

The symbol "###" is used in the source code to flag suggestions for future work.

Design Overview

This section presents an overview of the object-oriented design.

How to Read the Picture

The overall design is depicted in the illustration that follows. There are more object definitions in the code base than what is shown in the illustration; however, the key points of the overall design are represented.

[Illustration: OOAUDIO.GIF - overall design of the OO audio driver sample]

Only a few of the key data members and member functions are shown in the illustration. Refer to the .HPP file in the source code for a complete definition. The names of the files are similar to the names of the objects they contain.

Some of the boxes in the illustration are not really C++ objects. Quotes are used around the names in these boxes.

Modules in the illustration which are not true objects can be:

  • Files which contain procedural code ("Init", "Mix_Dev Workers", "CODEC Workers", "IOCtl interface")
  • Modules which describe an important set of interfaces ("IOCtl interface", "SHDD interface")
  • Modules which are a globally-scoped pointer to an important list ("Audio HW List", "Stream List"). If container classes were available, these modules would have been implemented as containers.

Solid lines depict object associations, and usually mean that there are embedded pointers which link the two objects. Dotted lines indicate call/return relationships.

Key Classes

The two key classes, from an overall system-design perspective, are the Stream class and the AudioHW class. The Stream class encapsulates the many complex interfaces between the OS/2 Multimedia Presentation Manager/2 (MMPM/2) subsystem and the driver, and resolves these operations into a small collection of public methods that are defined in the AudioHW class (and its derived MIDI and Wave classes). Streams are created when a Ring 3 process starts to play or record media. No Streams exist during driver initialization; driver initialization is always completed before any Streams are created. Streams are destroyed when the application closes the driver. Following initialization, any number of Streams can exist at any time, up to the limit of the audio driver's memory heap.

AudioHW classes encapsulate the algorithms to drive an audio hardware function. A Stream uses an AudioHW object to manipulate the hardware. AudioHW objects implement the public methods that Streams expect to drive.

All AudioHW classes provide Start() and Stop() operations. Additional functions are added as the hardware class is refined into more specific classes. For example, all MIDI hardware classes implement the NoteOn() and NoteOff() operations, while all Wave classes implement a ConfigureDevice() operation.

AudioHW objects are created at driver initialization time, based on the hardware features that are identified by the ResourceManager. Hardware objects exist for the duration of the device driver. There is either one hardware object per hardware feature, or two hardware objects per hardware feature if play and record are handled separately.

The Stream class is a generalization of all streams. The Stream class is refined into both a MidiStream and a WaveStream. Likewise, the AudioHW class is a generalization of all audio hardware. As was just mentioned, the AudioHW class is refined into a MIDI class. The MIDI class is then further refined into an MPU_401 class and an FM synthesis class. The AudioHW class is also refined into a generalized Wave class, then further refined into a class for play (WavePlay) and another for record (WaveRecord).
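
The following sketch shows only the shape of that hierarchy. Apart from Start(), Stop(), NoteOn(), NoteOff(), and ConfigureDevice(), which are named in this document, the parameter lists and member bodies are assumptions; the .HPP files in the sample are authoritative.

// Shape of the hardware class hierarchy (sketch only; bodies and parameters assumed).
class AudioHW {
public:
    virtual void Start() {}                      // provided by every hardware class
    virtual void Stop()  {}
};

class MIDI : public AudioHW {
public:
    virtual void NoteOn(unsigned char note, unsigned char velocity) {}   // parameters assumed
    virtual void NoteOff(unsigned char note) {}
};

class MPU_401 : public MIDI { /* drives the MPU-401 interface */ };
class FMSynth : public MIDI { /* drives the OPL3-compatible FM synthesizer */ };

class Wave : public AudioHW {
public:
    virtual void ConfigureDevice() {}            // parameters assumed
};

class WavePlay   : public Wave { /* play through the CODEC   */ };
class WaveRecord : public Wave { /* record through the CODEC */ };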

Association of Stream and AudioHW Objects

Only the final refinements (derivations) of the stream and hardware classes are actually created (instantiated).

On the hardware side, for the Tropez Plus card, there is one Wave CODEC (a CS4232), one OPL3-compatible FM synthesizer, a Wavetable device with an MPU-401 interface, and an additional MPU-401 port for external MIDI devices. The initialization code in the driver sample creates one WavePlay object and one WaveRecord object for operating the CODEC. Additionally, an MPU401 object is created for controlling the Wavetable device, and an FMSynth object is created for controlling the FM synthesizer.

MIDI streams and Wave streams are created on demand from application requests; none are created at initialization time. Following initialization, there can be any number of paused or stopped streams associated with one hardware object. However, there can be no more than one running stream associated with a hardware object at any time. For discussion purposes, the term "current focus" is used to describe the association that is currently active between an AudioHW object and a stream.

The association between streams and hardware objects is set up at stream-creation time during the processing of the AudioInit IOCtl. The media type specified by MMPM/2 is matched against available hardware objects and an appropriate selection is made. The stream.pahw (pointer to audio hardware) data member points a Stream to its AudioHW association. AudioHW objects can determine their "current focus" (this usually needs to be done during the hardware's interrupt service routine) by calling the globally-scoped FindActiveStream() function.
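
A hedged sketch of that lookup from an interrupt service routine follows. The Stream declaration, the FindActiveStream() signature, and the STREAM_WAVE_PLAY tag are assumptions; only the function names come from the sample.

// Sketch only: how a hardware object's ISR might locate its current focus.
class Stream { public: void Process(); /* ... */ };

extern Stream *FindActiveStream(int streamType);    // globally-scoped lookup (signature assumed)
const int STREAM_WAVE_PLAY = 1;                      // hypothetical stream-type tag

void WavePlayInterruptHandler()
{
    Stream *pStream = FindActiveStream(STREAM_WAVE_PLAY);   // which stream has current focus?
    if (pStream != 0)
        pStream->Process();          // let that stream consume or refill its buffers
}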

Container Classes

The AudioHW list and Stream list are shown as composition classes in the OO model. These classes contain the lists of all audio hardware objects and all stream objects, respectively, that exist at the current time. In fact, these list (or container) objects are not C++ objects. The lists are pointed to by the pAudioHWList and pStreamList global variables. You can learn more by reviewing the QUEUEHEAD and QUEUEELEMENT classes (refer to file QUEUE.HPP).

Adaptation Roadmap

With the Stream and AudioHW objects reviewed, you can turn your attention to the remaining objects that are represented in the illustration. Several of these classes will require some adaptation for your product. This section provides a step-by-step guide to changing the source code for these classes.

CODEC Worker Class

The CODEC worker class routines are a good place to begin changing the source code. The sample implements the CODEC workers as a set of simple, globally-scoped worker routines that perform the low-level I/O. These worker routines are utilized by the Wave classes (WavePlay, WaveRecord), the Timer class, and the Mixer workers (Mix_Dev module) to implement a variety of features. The Init and worker routines in this module need to be modified to operate your hardware. The sample code, which drives the CS4232 CODEC, is found in the file CS4232.CPP. The following functions are found in that file. In general, you will need all of these services, but you will need to modify them to operate your hardware:

  • void InitCS4232(USHORT BaseAddress) - This function is called from StrategyInit (INIT.CPP) at device driver init time. Its purpose is to initialize the CODEC. The function performs the following:
    • Resets and calibrates the CODEC
    • Initializes the mixer registers
    • Calls into the WavePlay and WaveRecord objects, so that these Wave objects can add their interrupt service routines to the IRQ object
    • Sets up the CODEC hardware to allow interrupts
  • UCHAR WaitForInitComplete(void) - Certain operations cause the CS4232 to enter a reset state, and while in this state, the CODEC will not respond to the bus. This function waits for this state to clear.
  • WaitForACIBit(void) - This function waits for the CS4232 to complete a calibrate command.
  • UCHAR GetReg(USHORT index) - This function fetches data from one of the four "direct" registers. A direct register is one that is directly accessible via bus I/O.
  • UCHAR SetReg(USHORT index) - This function writes data into one of the four "direct" registers.
  • UCHAR GetIndexReg(UCHAR index) - This function fetches data from one of the 32 "indirect" registers. An indirect register is one that is internal to the CODEC and is not accessible via bus I/O.
  • UCHAR SetIndexReg(UCHAR index) - This function writes data into one of the 32 "indirect" registers.
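
The indirect registers follow a common index-port/data-port pattern. The sketch below illustrates that pattern only; the port offsets, helper names, and I/O primitives are assumptions, and the GetIndexReg()/SetIndexReg() workers in CS4232.CPP are the real implementation.

// Sketch of indexed CODEC register access; names, offsets, and I/O helpers are assumptions.
typedef unsigned char  UCHAR;
typedef unsigned short USHORT;

extern UCHAR inportb(USHORT port);               // hypothetical port-input helper
extern void  outportb(USHORT port, UCHAR data);  // hypothetical port-output helper

static USHORT usCodecBase;                       // CODEC base address from the ResourceManager

UCHAR ReadIndexedReg(UCHAR index)
{
    outportb(usCodecBase, index);                // select the indirect register...
    return inportb(usCodecBase + 1);             // ...then read it through the data port
}

void WriteIndexedReg(UCHAR index, UCHAR data)
{
    outportb(usCodecBase, index);                // select the indirect register...
    outportb(usCodecBase + 1, data);             // ...then write the new value
}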

Line Class (for Mixer Features)

Both the OS/2 MMPM/2 MultiMedia subsystem and the sample treat the mixer as a collection of lines. A line can be thought of as an analog signal which is either an input (a "Source") or an output (a "Sink"). Lines can be connected to each other, and they have controls which can be adjusted.

A Line class has been defined to encapsulate this abstraction. The Line class defines generic connect and control functions, and handles the interface between the audio driver and MMPM/2. The Mix_Dev workers translate these generic commands into the appropriate calls to the CODEC worker routines. Once your CODEC workers are in place, this is the next set of worker routines to adapt for your target hardware.

The files LINE.HPP and LINE.CPP contain the definitions, member functions, global data, and global functions used to implement the mixer. The OS/2 mixer-related definitions and functions are in OS2MIXER.H. There is no Mixer class defined in the source code base. The mixer is implemented as a set of Line objects, where each Line object manages the connections and controls of a specific source or sink.

All of the source code in LINE.CPP should be usable without change. However, you will need to review and update three tables to adapt the sample for your device. A discussion of these tables and the functions that use them follows.

ControlIndex[]

ControlIndex is an array of class control_index. MMPM/2 defines a "ulControl" field in both the MIXERCONTROL and MIXERLINEINFO structures (see OS2MIXER.H). The values in this field are used by MMPM/2 to identify specific controls (such as volume or gain). These values are not sequential. However, within the sample, a sequential set of integers is needed to reference these attributes. The ControlIndex array provides the mapping between the MMPM/2 control values and the control indices used within the sample.

The ControlIndex[] array is defined in LINE.CPP and is used by the GetControlIndex function. ControlIndex will need to be extended if your device supports controls other than mute, gain, volume, monitor, and balance.
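
A hedged sketch of the lookup that GetControlIndex() might perform follows. The control_index layout, the table-size variable, and the "not supported" return value are assumptions; LINE.CPP holds the real definitions.

// Hypothetical illustration of the MMPM/2-control-to-index mapping.
typedef unsigned long  ULONG;
typedef unsigned short USHORT;

class control_index {
public:
    ULONG  ulControl;        // MMPM/2 ulControl value (from MIXERCONTROL / MIXERLINEINFO)
    USHORT usIndex;          // sequential index used inside the driver
};

extern control_index ControlIndex[];         // defined in LINE.CPP
extern USHORT        usNumControls;          // table size (name assumed)
const  USHORT        BAD_CONTROL = 0xffff;   // hypothetical "not supported" value

USHORT GetControlIndex(ULONG ulControl)
{
    for (USHORT i = 0; i < usNumControls; i++)
        if (ControlIndex[i].ulControl == ulControl)
            return ControlIndex[i].usIndex;  // found: return the driver-internal index
    return BAD_CONTROL;                      // control not supported by this device
}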

lineindex[]

The lineindex array is an array of class line_index.

MMPM/2 defines a set of line identifiers for the set of generic sources and sinks that were anticipated by the MMPM/2 designers. The lineindex[] array is used to map the MMPM/2 line values (found in the ulLine field of the MIXERLINEINFO, LINECONNECTIONS, and MIXERCONTROL structures (OS2MIXER.H)) to the Line objects that the sample creates to control the actual lines defined by the device.

The lineindex array is defined in LINE.CPP and should be updated to reflect the lines for your device. You need to delete members from this table where your target device does not support a particular line. Likewise, you will need to add members to this table if your target device supports lines that are understood by MMPM/2 but not used by the sample.

The InitMixer function (file LINE.CPP) sets up the association between the line number and the correct line object.

The lineindex array is used by the globally-scoped function GetLineIndex.

init_lines[]

The init_lines array holds the information that is used to create each Line object. It is used by the InitMixer function and the Line class constructor to build and initialize all of the Line objects. The init_lines[] array contains a dev_lineinfo object for every line that needs to be created. The init_lines array resides in the init data segment and is discarded after initialization. A dev_lineinfo object contains the following information:

class dev_lineinfo {
public:
   ULONG ulNumChannels; // the number of channels for this line

   ULONG ulSupport;     // a bit mask of all the mixer controls this line
                        // supports; see the flags for ulSupport in OS2MIXER.H.
                        // If the line supports MIX_VOLUME and MIX_MUTE, the
                        // value of this field would be 0x00080040.

   ULONG ulCanConnect;  // a bit mask of all the lines to which this line can connect;
                        // see the source and sink definitions in OS2MIXER.H.
                        // If this line could connect to SINK_SPEAKER and
                        // SINK_HEADPHONES, this field would be 0x00600000.

   ULONG ulLine;        // the MMPM/2 line number for this line.
                        // If this were SOURCE_SYNTHESIZER, the value
                        // would be 0x00000001.

   ULONG ulCallConnect; // a bit mask of all the lines that require a function
                        // to be called in order to connect to them.
                        // Note: some connections exist simply by virtue of
                        // on-chip wiring, like connecting SOURCE_WAVE to
                        // SINK_SPEAKER; however, to connect SOURCE_LINE
                        // (line in) to SINK_WAVE (the input to the ADC),
                        // the input mux must be set up.

   ULONG ulCallMute;    // a bit mask of lines that can be connected and
                        // disconnected by muting and unmuting them.
                        // For example, one could disconnect SOURCE_INTERNAL_AUDIO
                        // (CD Audio In) from SINK_SPEAKER by muting the
                        // input from the CD.

   ULONG ulMyLine;      // the index of this line (see LINE.HPP)
};

The ulCallMute, ulCallConnect, and ulSupport member data items determine whether a mixer device worker needs to be called, and if so, what kind of service needs to be requested. The target hardware illustrates this point. An examination of the mixer hardware on the CS4232 chip shows that only lines that are being connected to the Analog-to-Digital Converter (ADC) require any register manipulation. The implementation of the Tropez Plus card (the target adapter for the sample) is such that only the microphone and line-in require connect/disconnect to be performed on them. All other connect/disconnect commands are turned into mute/unmute control commands by the Line class (as flagged by the ulCallMute element).
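
As a hedged illustration only, a single init_lines[] entry might look like the following. The mask values simply reuse the example values quoted in the dev_lineinfo comments above; the channel count, ulCallConnect, ulCallMute, and ulMyLine values are assumptions.

// Hypothetical init_lines[] entry built from the example values quoted above.
dev_lineinfo SynthLineInfo = {
    2,            // ulNumChannels: stereo line
    0x00080040,   // ulSupport:     MIX_VOLUME | MIX_MUTE
    0x00600000,   // ulCanConnect:  SINK_SPEAKER | SINK_HEADPHONES
    0x00000001,   // ulLine:        SOURCE_SYNTHESIZER
    0x00000000,   // ulCallConnect: assumed wired on-chip, so no connect call needed
    0x00600000,   // ulCallMute:    connect/disconnect handled by mute/unmute (assumed)
    1             // ulMyLine:      driver-internal index (hypothetical; see LINE.HPP)
};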

Mix_Dev ("Mixer Device") Workers

The mix_dev worker routines are used to implement the Connect and Control (adjustment) commands received from MMPM/2. When a Line object determines that an I/O operation on the mixer hardware is needed, a mix_dev worker routine is called to perform the I/O.

The source files, MIX_DEV.HPP and MIX_DEV.CPP, contain the "mixer device" worker routines and associated definitions. All mixer operations are performed by one of four primitive operations. The sample exploits this by providing the mix_dev worker routines as table-driven functions. You will need to update these tables. You may need to update the primitive functions.

The key tables are the Connect[] array, which is composed of class ConnectInfo members, and the Controls[] array, which is composed of class ControlInfo members. Both tables are static from the time of build, and the values can be manipulated by directly editing the source.

A Connect member entry contains the following information:

class ConnectInfo {
public:
    USHORT Source;      // the source index
    USHORT Sink;        // the sink index
    USHORT Reg1;        // the register address of the first register;
                        // all the registers used in the mix_dev routines are
                        // indexed registers.
                        // The first register is read and its data is then:
    USHORT ConAndMask1; // ANDed when doing a connect
    USHORT ConOrMask1;  // ORed  when doing a connect
    USHORT DisAndMask1; // ANDed when doing a disconnect
    USHORT DisOrMask1;  // ORed  when doing a disconnect
    USHORT Reg2;        // the register address of the second register
                        // (0xffff when only 1 register)
    USHORT ConAndMask2; // masks for the second (stereo) register; used as above
    USHORT ConOrMask2;  //
    USHORT DisAndMask2; //
    USHORT DisOrMask2;  //
};

A Controls[] member is specified as follows:

class ControlInfo {
public:
    USHORT Line;        // index of the line
    USHORT Control;     // control to be done
                        // pointer to the function that does the control
    USHORT (* pFunc)(USHORT,ULONG);
    USHORT Reg1;        // index of first register
    USHORT Reg2;        // index of second register (0xffff if only 1 reg)
    USHORT AndMask1;    // And Mask (if required)
    USHORT OrMask1;     // Or Mask (if required)
    USHORT AndMask2;
    USHORT OrMask2;
    void * ControlData; // pointer to an array used to resolve the value
                        // to be written into the hardware
};

MixDevConnect() implements one of the primitive mixer device functions, and MixDevControl() implements the other three. MixDevConnect() performs the connect and disconnect commands for the Line class. MixDevControl() performs adjustment commands, such as unmuting the CD audio input and increasing its volume. Each of these functions is described here:

  • USHORT MixDevConnect(USHORT Source, USHORT Sink, USHORT Operation) - MixDevConnect receives connect requests from the Line objects. This function will probably not require any adaptation once the Connect[] table is properly set up.

MixDevConnect resolves the request into a series of register read / register write commands. Each Connect command uses the appropriate ConnectInfo object entry in the Connect[] array (MIX_DEV.CPP). The ConnectInfo object holds the information that MixDevConnect() needs to perform the operation. The Source and Sink parameters are actually the indices of the lines (obtained from GetLineIndex) to be acted upon. The Operation parameter determines whether this is a connection (non-zero) or a disconnection (zero). The Source and Sink are passed to Getcontrol(), which returns the index of the ConnectInfo object or BADRC. The register address, along with the "and" and "or" fields in the ConnectInfo class, is then used to perform the operation, as sketched below.
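
The read/AND/OR/write step can be sketched as follows. This sketch reuses the ConnectInfo class shown above and the hypothetical ReadIndexedReg()/WriteIndexedReg() helpers from the CODEC worker sketch; it is not the sample's actual code.

// Sketch only: the read-modify-write MixDevConnect() performs for one ConnectInfo entry.
typedef unsigned char  UCHAR;
typedef unsigned short USHORT;

extern UCHAR ReadIndexedReg(UCHAR index);              // hypothetical helpers from the
extern void  WriteIndexedReg(UCHAR index, UCHAR data); // CODEC worker sketch

static void ApplyMasks(USHORT Reg, USHORT AndMask, USHORT OrMask)
{
    UCHAR data = ReadIndexedReg((UCHAR) Reg);          // fetch the current register contents
    data = (UCHAR)((data & AndMask) | OrMask);         // clear, then set, the routing bits
    WriteIndexedReg((UCHAR) Reg, data);                // write the updated value back
}

void DoConnect(const ConnectInfo &ci, USHORT Operation)   // hypothetical wrapper, for illustration
{
    if (Operation) {                                    // non-zero: connect
        ApplyMasks(ci.Reg1, ci.ConAndMask1, ci.ConOrMask1);
        if (ci.Reg2 != 0xffff)                          // second register for stereo pairs
            ApplyMasks(ci.Reg2, ci.ConAndMask2, ci.ConOrMask2);
    } else {                                            // zero: disconnect
        ApplyMasks(ci.Reg1, ci.DisAndMask1, ci.DisOrMask1);
        if (ci.Reg2 != 0xffff)
            ApplyMasks(ci.Reg2, ci.DisAndMask2, ci.DisOrMask2);
    }
}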

  • USHORT MixDevControl(USHORT Line, USHORT Control, ULONG Value) - MixDevControl() receives control requests from Line objects, and implements the primitive functions for:
    • Adjustment (for example, volume change)
    • Bit-specific controls (mute / unmute)
    • Special commands (such as turn on record monitor)
In an analogous fashion to MixDevConnect(), there must be a ControlInfo entry in the Controls[] array for every command that gets sent in. This function will probably not require any adaptation once the Controls[] table is properly set up; however, you will need to review the low-level workers which this function calls.
The ControlInfo class not only has the register indices and bit masks (like the ConnectInfo), but also contains the address of an array used to convert the Value parameter into a value to be set into the register, and the address of the function that will perform the operation. The three types of control operations are mapped to these three control functions. You need to review these routines for adaptation needs:
  • USHORT DoIntAdjOp(USHORT index, ULONG value) - Performs register adjustment commands such as volume and gain
  • USHORT DoIntBoolOp(USHORT index, ULONG value) - Performs bit-specific commands such as mute/unmute
  • USHORT DoMonitorOp(USHORT index, ULONG value) - Enables/disables the record monitor

Initialization and ResourceManager

With the CODEC and Mixer workers in place, it's time to initialize your hardware and try to get the first sounds from your system.

The ResourceManager is a fairly substantial object. It provides some simple member functions which you will use to determine whether your hardware is present in the system, and if so, what I/O, DMA, IRQ and other resources you should be using. The ResourceManager exists only during initialization. The Plug and Play ID of your product is used as the handle to identify your hardware.

Note that the ResourceManager support provided here applies only to the ISA environment.

You need to understand both the LDev_Resources and ResourceManager classes to obtain the resource information. Both classes are defined in the file RM.HPP. A description of these classes follows:

The LDev_Resources ("Logical Device Resources") class is intended to hold the resource information for a single logical device. It contains four arrays. The first two arrays contain the base I/O address and number of contiguous ports for an I/O port assignment. The remaining two arrays provide the IRQ and DMA channel assignments, respectively. An LDev_Resources object follows these rules:

  • Any unused member in an array is guaranteed to be set to -1. The value -1 is chosen since the value 0 can be a legitimate value for some I/O.
  • All the non-empty members of an array appear before any empty member. Once an empty member of an array is found, you can ignore all elements with higher index values.
  • The ordering of I/O, IRQ, and DMA information within an array is consistent with the order in which the resource information was reported by the hardware resource.
  • You can check for an empty LDev_Resources object by calling the isEmpty() method.

The ResourceManager class is provided to encapsulate all I/O resource detection and assignments. The ResourceManager uses the OS/2 Resource Manager to obtain conflict-free resource assignments, and then allocates these assignments for the sample's exclusive use. However, the OS/2 Resource Manager provides resource detection only in Warp Version 4 or later, so this sample provides a template for automatic resource detection only on that level of the operating system. OS/2 Warp 3.0 support is discussed below.

You will need to make some minor adaptations to the ResourceManager object before using it. The vendor information in the vendor identification strings needs to be changed to refer to your product and your company. Also, the number of I/O decode lines needs to be changed to reflect the operation of your hardware. The source code lines are flagged with the comment "### IHV" in file RM.CPP.

With these updates in place, you are ready to use the ResourceManager to obtain I/O resources. The sample shows how the ResourceManager object is created in the StrategyInit routine (file INIT.CPP). As shown in the sample, there are two methods you will need to use to obtain the resource information:

  • bIsDetected() - This is a Boolean function that reports whether the hardware feature is detected as present in the system. This method accepts the Plug and Play ID of your logical devices as a parameter.
  • pGetDevResources() - This is a function that returns an LDev_Resources object which contains the I/O assignments for a logical device. Again, you supply the Plug and Play ID of your logical device as a parameter.
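
A hedged sketch of how these two calls might be used at init time follows. MY_PNP_ID, the local variable names, the LDev_Resources member name, and the pointer return type are assumptions for illustration; the real code is in StrategyInit in INIT.CPP, and the driver's RM.HPP declarations are assumed to be in scope.

// Sketch only: discover the logical device and pull its first I/O assignment.
void DetectAndInitCodec(ResourceManager *pRM)
{
    const ULONG MY_PNP_ID = 0;                      // substitute your logical device's PnP ID
    if (!pRM->bIsDetected(MY_PNP_ID))
        return;                                     // hardware not present; do not load

    LDev_Resources *pRes = pRM->pGetDevResources(MY_PNP_ID);   // return type assumed
    if (pRes == 0 || pRes->isEmpty())
        return;                                     // no conflict-free assignment available

    USHORT usBase = pRes->usIOBase[0];              // first I/O range (member name assumed);
                                                    // unused entries are set to -1
    InitCS4232(usBase);                             // initialize the CODEC at that address
}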

Use the identified I/O resources to initialize your CODEC. Initialize any other hardware required to run the mixer. Then try enabling an external line (for example, CD, Line) and verify mixer functions such as mute and volume. The OO sample will work with the Mixer application found on the IBM Developer Connection. Alternatively, you might choose to simply hardcode these functions into the Init of your driver.

Resource Management on Warp 3.0

In order to provide some level of support in the Warp 3.0 environment, the ResourceManager class also provides for accepting user specifications of the I/O assignments from the DEVICE= statement in CONFIG.SYS. However, the processing of these overrides uses the unique "SLAM" feature of the CS4232 chip. The SLAM feature bypasses Plug and Play interactions and directly programs the hardware feature to a set of I/O resources.

The SLAM operation and user override functions are encapsulated into a Slam object and an Override class, respectively. The Override class defines a set of I/O defaults for the various logical devices, while the Slam class encapsulates the algorithms for SLAM'ing the CS4232. These classes are all unique to the operation of the target hardware of this sample.

If the SLAM operations are used, the vRun() method of the ResourceManager object must also be invoked to complete the configuration of the chip. Again, this is specific to the CS4232.

A better approach to operation on Warp 3.0 would be to interface with the system Plug and Play BIOS (if Plug and Play BIOS is available). Under this approach, the ResourceManager would obtain resource information from the system BIOS on a Warp 3.0 environment. However, this level of support is not provided in the current sample.

MIDI

You can now try to operate MIDI. The MIDI hardware classes (MPU401, FMSynth) should work without modification if the OS/2 system timer is used to generate the timer support. For now, force the use of the OS/2 system timer by placing a "/O:NoHWTimer" option on the DEVICE= statement in CONFIG.SYS. More will be said later about using hardware timer features that your audio chip set might provide.

Create the MPU401 or FMSynth hardware objects at Init time using the I/O resource information provided by the ResourceManager. Unmute the mixer lines, and listen for sound.

Wave

The sample is structured so that play and record operations are implemented in separate classes. Adapting the Wave classes for your device is normally the biggest step of the adaptation process. This section provides an overview of the development steps you need to perform.

WAVEAUDIO

Both play and record classes are derived from a common parent class WAVEAUDIO (files WAUDIO.*PP). The WAVEAUDIO class defines the data and functions that are common to both the play and record operations. You will need to make a number of adaptations to this class.

The files WAUDIO.*PP contain several definitions which are specific to the sample target hardware. For example, frequency and clock select values for the CODEC are defined in these files. These definitions are flagged by the text "CS4232". You will need to update or replace these definitions so they match your target hardware.

WAVEAUDIO and Supported Riff Data Types

The sample supports 8-bit PCM, 16-bit PCM (two's complement), Mu-Law, and A-Law data types. If your device supports some other set of data types, then you will need to make changes to support your set of data types.

The sample defines a set of integer handles which are used to select a particular audio hardware object. MMPM/2 uses a different set of handles. The mapping between the two sets is defined by the HardwareArray[] data structure in AUDIOHW.CPP. If you support more data types (or fewer data types) than the sample, you need to make additions (or deletions) to this table.

This table is defined statically. However, additional entries can be set up at initialization time by using the SetHardwareType() function. This function is used by the sample, for example, to control whether the FM synthesis or Wavetable device will appear as the default MIDI device.

In addition to updating HardwareArray[], you will also need to update the DevCaps() and ConfigDev() member functions of the WAVEAUDIO class.

WAVEAUDIO::DevCaps()

If you support more data types (or fewer data types) than the sample, you need to make additions (or deletions) to the case statement that handles the various data types. You should also read the function for possible impacts if your "bits per sample" value is not 8-bit or 16-bit.

If your device does not support full-duplex, you will need to make a change to reflect this. The change is documented and flagged by the comments labeled "Full Duplex Enabling".

WAVEAUDIO::ConfigDev( WaveConfigInfo* )

If you support more data types (or fewer data types) than the sample, you need to make additions (or deletions) to the case statement that handles the various data types.

The file WAUDIO.HPP defines a WaveConfigInfo structure, which is used as part of the WAVESTREAM class definition. The WaveConfigInfo structure contains configuration parameters that are specific to a single instance of a stream, such as sampling rates. Each instance of a Wave hardware object needs access to these parameters prior to streaming a new data type to the CODEC. The ConfigDev() member function is used by the WAVESTREAM class to pass these parameters from the stream to the hardware object. ConfigDev() is always called by the stream before the stream calls Start().

ConfigDev() uses the WaveConfigInfo to compute values for the WAVEAUDIO private data members _ucClockData, _ucFormatData, and _usCountData. ConfigDev() also calculates the values for WaveConfigInfo data members ulPCMConsumeRate, ulBytesPerIRQ, and usSilence, and saves these values in the WaveConfigInfo structure that was passed by a pointer. These computed values are used later by the WAVESTREAM, during the play or record operation.

The algorithms which process these configuration parameters must be adapted to the specifications for your CODEC.

WAVEAUDIO::_vset_clock_info() and Sampling Rates

The _vset_clock_info() function computes the clock value which is the best match between the sampling rates supported by the CODEC and the sampling rate of a particular stream. This function is called by WAVEAUDIO::ConfigDev(), and the result (a value which selects a clock frequency) is saved in the WAVEAUDIO::_ucClockData data member. This clock select value is written into the CODEC at a later point, when a play or record operation is started. The _vset_clock_info() function uses the frequency table defined by the Freq_table[] array in WAUDIO.CPP.

You will need to adapt _vset_clock_info() and Freq_table[] for your target hardware. You may need to replace this algorithm and data structure with an entirely different algorithm, depending on your CODEC specification.
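
To make the "closest match" step concrete, here is a hedged sketch of the selection; the Freq_table layout, its entry values, and the PickClockSelect() helper name are assumptions, and WAUDIO.CPP holds the real table and algorithm.

// Hedged sketch of choosing a clock-select value from a frequency table.
typedef unsigned long ULONG;
typedef unsigned char UCHAR;

struct FreqEntry { ULONG ulSampleRate; UCHAR ucClockSelect; };   // hypothetical layout

static const FreqEntry Freq_table[] = {
    {  8000, 0x00 },   // entry values illustrative only
    { 11025, 0x01 },
    { 22050, 0x02 },
    { 44100, 0x03 },
};

UCHAR PickClockSelect(ULONG ulRequestedRate)
{
    ULONG ulBestDiff = (ULONG) -1;
    UCHAR ucBest = Freq_table[0].ucClockSelect;
    for (int i = 0; i < (int)(sizeof(Freq_table) / sizeof(Freq_table[0])); i++) {
        ULONG ulDiff = (Freq_table[i].ulSampleRate > ulRequestedRate)
                     ? Freq_table[i].ulSampleRate - ulRequestedRate
                     : ulRequestedRate - Freq_table[i].ulSampleRate;
        if (ulDiff < ulBestDiff) {       // keep the closest supported rate
            ulBestDiff = ulDiff;
            ucBest = Freq_table[i].ucClockSelect;
        }
    }
    return ucBest;                       // the sample saves this in _ucClockData
}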

You will also find a private member function named _usfind_matching_sample_rate(). This function finds the closest match between the sampling rate for a particular stream, and the predefined sampling rates which are defined by MMPM/2. You will not need to change this function.

Associations Between Wave and Other Objects

In the design of this sample, a Wave class is associated with one DMACHANNEL (files DMA.*PP), one IRQ (IRQ.*PP), and one AUDIOBUFFER (AUDIOBUF.*PP). There should be no changes required in any of these associated classes.

The associations between the Wave object and the DMA, IRQ, and Audio buffer objects are made by the _vSetup member function. This setup function is called from both the WAVEPLAY and WAVEREC constructors. The _vSetup() function instantiates (creates) an AUDIOBUFFER, a DMACHANNEL, and an IRQ. If your device uses one DMA channel and one IRQ, then this function does not need to be changed. If, on the other hand, your device has different resource requirements, you will need to adapt _vSetup() for your situation.

Adaptation of the WAVEPLAY and WAVEREC Classes

Once the WAVEAUDIO class has been adapted for your hardware, all that is left is the encoding of the specific I/O sequences required to start or stop the Wave operations of your CODEC. This is handled by the WAVEPLAY and WAVEREC classes.

WAVEPLAY and WAVEREC are defined in WAVEHW.HPP. No changes are needed in this file. The same four methods are defined for each of these two classes:

  • a constructor
  • Enable()
  • Start()
  • Stop()

The implementations of WAVEPLAY and WAVEREC are in .CPP files of the same names. In addition to implementations of the above-named functions, each of the WAVEPLAY and WAVEREC files contains an interrupt service routine which is private within the scope of the file.

The constructor should not require modification. The Enable() member function will not require modification unless you change the name of the interrupt service routine. The interrupt service routines for the WAVEPLAY and WAVEREC classes check to see if their interrupt is active, then check for an active stream of the correct type, and finally, call the Process() member function in the correct WAVESTREAM object. These routines will need adaptation to the interrupt service protocols for your device.

The Start() and Stop() member functions for the WAVEPLAY and WAVEREC classes are almost entirely device specific. You will need to replace the I/O sequence performed by the sample with the I/O sequences that are appropriate for your device.

Hardware Timers for MIDI

The Timer class is provided to maintain the real-time clock for MIDI streams. It uses real-time ticks from either the OS/2 system kernel, or preferably, from an onboard hardware timer on your audio feature. The Timer class provides a "scheduling" function to the associated MIDI stream. The MIDI stream provides a Timer object with the time at which the stream needs to perform additional processing. The Timer object keeps track of the passage of time, and then invokes the MIDI stream's Process() method at the specified time. The Process() method executes in Ring 0, task-time (not interrupt-time) context. This is accomplished by exploiting OS/2's "context hook" feature.

The Timer class will work without modification if the OS/2 system timer is used. This will provide a timing resolution of 31 milliseconds. If you have a hardware timer on your audio feature, it is preferable to exploit it. Use of your timer provides both a higher interrupt resolution and a more periodic interrupt rate. To adapt the Timer class for your hardware, you will need to modify both the Timer constructor as well as the static private functions defined within the Timer class (_vStartHWTicks(), _vStopHWTicks(), etc).

Adding Parameters to the DEVICE= statement

You may have a need for your end users to supply parameters on the DEVICE= statement in the CONFIG.SYS file. You will find that there is considerable support for doing this in the file PARSE.C. Reference that file for additional information.

The Log Class

The Log object provides the functionality of the "ddprintf()" utility in earlier audio driver samples, plus quite a bit more. In particular, the BINARYLOG refinement of the Log object will log into the system trace buffer in binary format. This provides an extremely fast trace - on the order of 20 microseconds to log a message with two words of trace data on a 166 MHz Pentium processor. By contrast, the same operation using ddprintf() on the serial line requires on the order of 20 milliseconds. The PERFLOG object provides for performance tracing using the RDTSC instruction of the Pentium processor.

File Versioning

Once the development of your driver is complete, and before shipping the driver to the field, check the vendor and file versioning information in the file \DDK\BASE\H\VERSION.MAK. Make sure the variables _OEM and _VERSION are set to your company name and the correct field version identifier for your driver. These strings will be built into the header of the driver, and can be viewed by your customer by using the BLDLEVEL command.
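
For example, the relevant lines in VERSION.MAK might look like the following; the values shown are placeholders, not settings from the DDK.

# Hypothetical settings in \DDK\BASE\H\VERSION.MAK
_OEM     = MyCompany
_VERSION = 1.00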

Summary and Conclusion

This sample is the first known application of OO technology to an OS/2 physical device driver. The authors adopted this approach in order to be able to easily mix and match different combinations of Wave, Mixer, and MIDI technologies. The structure of this sample lends itself to that objective, and the authors believe the sample provides an excellent base for developing other audio drivers.

Comments or Feedback?

Your comments and feedback on this document are welcome and can be directed via E-mail to the following:

  • Rich Jerant, IBM Corporation, jerant@us.ibm.com
  • Kip Harris, IBM Corporation, hkip@us.ibm.com
  • Brent Davis, IBM Corporation, bldavis@vnet.ibm.com

Trademarks

The following are trademarks of International Business Machines Corporation in the United States, or other countries, or both:

  • IBM
  • Multimedia Presentation Manager/2
  • OS/2

Microsoft, Windows, and the Windows 95 logo are trademarks or registered trademarks of Microsoft Corporation.

Other company, product, and service names may be trademarks or service marks of others.

© Copyright International Business Machines Corporation 1997. All rights reserved.

Object-Oriented OS/2 PCI Audio Device Driver Sample

Introduction

The new and modified object-oriented (OO) OS/2 PCI Audio Device Driver Sample is provided to make it easier for you to develop an OS/2 PCI audio driver, using the existing OO OS/2 Audio Device Driver Model. It is intended to serve as a roadmap for developing an OS/2 PCI audio device driver.

This sample exploits the OS/2 Resource Manager for identification and allocation of I/O resources for PCI Plug and Play adapters. This sample driver contains sample code to detect audio devices from the PCI bus and to create, allocate and register resources with the OS/2 Resource Manager.

Build Instructions

This sample is coded with the Watcom 11.0 compiler. You need to install the Watcom 11.0 compiler and follow the OO Audio sample driver build instructions, which can be found in the Device Driver Online Documentation under 'Using Your DDK Build Instructions', to set up your DDK build environment. After your build environment is set up, you can invoke 'wmake' from the ddk\base\src\dev\mme\pciaudio directory to compile the sample driver.

Trademarks

The following are trademarks of International Business Machines Corporation in the United States, or other countries, or both:

  • IBM
  • OS/2

© Copyright International Business Machines Corporation 1999. All rights reserved.