This page will give the developer a basic overview of the Wacom Feel™ Multi-Touch API framework, and details on writing a Multi-Touch application. All touch enabled Wacom tablets supported by the Wacom Tablet Driver are supported by this API. To find out if the Wacom Tablet Driver supports your tablet, please go to the Wacom tablet driver support page:


A Mac hardened runtime application wanting to access the Multi-Touch API must set the following entitlements:

Key                                                               Type        Value
Entitlements File                                                 Dictionary  (3 items)
    App Sandbox                                                   Boolean     YES
    com.apple.security.temporary-exception.mach-lookup.global-name
                                                                  Array       (2 items)
        Item 0                                                    String      com.wacom.CoordinatorMessageServer
        Item 1                                                    String      com.wacom.ProfessionalTouchDriverMessageServer
    App Groups                                                    Array       (1 item)
        Item 0                                                    String      com.yourCompany.MultiTouch.WacomTouch
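
As a sketch, the entitlements above correspond to an entitlements plist like the following. The exact key names (the mach-lookup temporary exception and App Groups keys) are assumptions based on the standard App Sandbox entitlements and should be verified against your Xcode capabilities settings:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- App Sandbox -->
    <key>com.apple.security.app-sandbox</key>
    <true/>
    <!-- Mach-lookup exception for the Wacom driver message servers -->
    <key>com.apple.security.temporary-exception.mach-lookup.global-name</key>
    <array>
        <string>com.wacom.CoordinatorMessageServer</string>
        <string>com.wacom.ProfessionalTouchDriverMessageServer</string>
    </array>
    <!-- App Groups -->
    <key>com.apple.security.application-groups</key>
    <array>
        <string>com.yourCompany.MultiTouch.WacomTouch</string>
    </array>
</dict>
</plist>
```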


The WacomMultiTouch framework is installed during the installation of the Wacom Tablet Driver. It is dynamically loaded at runtime by your application and must not be installed or bundled by your application. Two header files need to be included in an application using this framework:

  • WacomMultiTouchTypes.h – Defines data structures and types.

  • WacomMultiTouch.h – Entry points of the API.

Programming Basics

The basic use model for processing touch point data is as follows:

  1. Your application must have a weak reference to the MultiTouchFramework so that the framework is loaded dynamically when the application is loaded. If it does not load dynamically, confirm you have the latest Wacom Tablet Driver installed.

  2. Initialize the framework to establish communication with the driver.

  3. Establish a device attach callback. This will immediately be called for each multi-touch device already on the system, and again whenever the user attaches a new multi-touch device. This callback includes a capabilities structure for each device. Using this data, the application can determine which types of data are supported and their ranges.

  4. Establish a device detach callback. This will be called when a device is detached from the system. Use this callback to close any open data callbacks the application has created.

  5. Assuming a device is attached that supports touch point data, the application would establish a touch point data callback. This callback is triggered whenever a finger is detected on the device. Finger data includes position and state information. The application should either quickly process the data or buffer the data to a worker thread for further processing.

  6. When the application is done processing touch data it should close all the callbacks and call the quit function.
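
The steps above can be sketched in C. The entry-point names mirror WacomMultiTouch.h, but the stand-in bodies, simplified callback signatures, and the version number passed to WacomMTInitialize are all assumptions made here so the call order can run self-contained:

```c
#include <stdio.h>

/* Stand-ins for the entry points declared in WacomMultiTouch.h.  The
   names follow the API, but the callback signatures are simplified and
   the bodies below only simulate a single attached device so that the
   call order of steps 2-6 can be shown self-contained. */
typedef enum { WMTErrorSuccess = 0 } WacomMTError;
typedef void (*AttachCB)(int deviceID, void *userData);
typedef void (*DetachCB)(int deviceID, void *userData);

static WacomMTError WacomMTInitialize(int version) { (void)version; return WMTErrorSuccess; }
static void         WacomMTQuit(void) {}
static WacomMTError WacomMTRegisterAttachCallback(AttachCB cb, void *userData)
{
   cb(1, userData);   /* fires immediately, once per device already attached */
   return WMTErrorSuccess;
}
static WacomMTError WacomMTRegisterDetachCallback(DetachCB cb, void *userData)
{
   (void)cb; (void)userData;
   return WMTErrorSuccess;
}

/* --- Application side -------------------------------------------------- */
static int gKnownDevices = 0;

static void OnDeviceAttach(int deviceID, void *userData)
{
   (void)userData;
   ++gKnownDevices;   /* step 3: inspect capabilities, open data callbacks */
   printf("device %d attached\n", deviceID);
}

static void OnDeviceDetach(int deviceID, void *userData)
{
   (void)deviceID; (void)userData;
   --gKnownDevices;   /* step 4: close this device's open data callbacks */
}

/* Steps 2-6 in order; returns the number of devices seen, or -1 on error. */
int RunTouchSession(void)
{
   if (WacomMTInitialize(4) != WMTErrorSuccess)          /* step 2 (version illustrative) */
      return -1;
   WacomMTRegisterAttachCallback(OnDeviceAttach, NULL);  /* step 3 */
   WacomMTRegisterDetachCallback(OnDeviceDetach, NULL);  /* step 4 */
   /* step 5: register finger/blob/raw data callbacks and process data here */
   WacomMTQuit();                                        /* step 6 */
   return gKnownDevices;
}
```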

The Multi-Touch API does not provide or process pen data. If the application needs to process both pen and touch data then the application should monitor both APIs.

Overview of Multi-Touch Application

This section will provide detailed information about the recommended high-level structure of a Multi-Touch application.

Initialization Function

It is best to call WacomMTInitialize first. WacomMTInitialize establishes communication with the Wacom Feel™ Multi-Touch framework. If this succeeds, the entry points for all the other API functions become available. A call is then made to the framework with the version of the API requested by the application; if the framework supports this version, it reports success. If any part of this function fails, the function returns an error code and the framework cannot be used.

Quit Function

If the framework was loaded, then it is necessary for your application to release any open resources. When exiting your application, a call to WacomMTQuit will take care of the cleanup.

Attached Device Array Function

Your application will need to know how many multi-touch devices are attached. Use WacomMTGetAttachedDeviceIDs for this purpose. The function takes a buffer of integers and its size, and returns the number of attached devices. To use it, the application should first call it with no buffer and no size; this returns the number of devices attached. The application then creates a buffer and calls the function again. Note that the number of devices could change between these two calls. The second call again returns the number of devices attached and provides as many of the device IDs as the buffer will hold. These device IDs are then used to get the device capabilities. It is recommended that the application use the attach callback function instead.
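
The two-call pattern can be sketched as follows. The stand-in below hard-codes two hypothetical device IDs and assumes the buffer size is given in bytes, so the pattern can run self-contained; the real function is declared in WacomMultiTouch.h:

```c
#include <stdlib.h>
#include <string.h>

/* Stand-in for WacomMTGetAttachedDeviceIDs: fills as many IDs as the
   buffer holds and always returns the total number of attached devices.
   Two hypothetical devices (IDs 7 and 12) are hard-coded here. */
static const int kFakeIDs[] = { 7, 12 };

static int WacomMTGetAttachedDeviceIDs(int *deviceArray, size_t bufferSize)
{
   const int count = (int)(sizeof(kFakeIDs) / sizeof(kFakeIDs[0]));
   if (deviceArray)
   {
      int fit = (int)(bufferSize / sizeof(int));
      if (fit > count)
         fit = count;
      memcpy(deviceArray, kFakeIDs, (size_t)fit * sizeof(int));
   }
   return count;
}

/* The two-call pattern described above: size first, then fill.  Returns
   the device count and places a malloc'd ID array in *outIDs.  A robust
   application would loop, since the count can change between calls. */
int GetAttachedDevices(int **outIDs)
{
   int count = WacomMTGetAttachedDeviceIDs(NULL, 0);   /* first call: count only */
   if (count <= 0) { *outIDs = NULL; return 0; }

   int *ids = (int *)malloc((size_t)count * sizeof(int));
   count = WacomMTGetAttachedDeviceIDs(ids, (size_t)count * sizeof(int));
   *outIDs = ids;
   return count;
}
```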

Device Capabilities Function

Your application will probably want to know the capabilities of the attached multi-touch devices. The WacomMTGetDeviceCapabilities function returns the device capabilities for a given device ID. If the application used the WacomMTGetAttachedDeviceIDs function to get the IDs, call this function with each ID to get a capabilities structure. These capabilities structures should be cached by the application so it knows the scaling of the data provided by each device. An application could instead call this function each time it processes data, but that causes a lot of unneeded processing.

Attach and Detach Device Callback Functions

In order to receive tablet attach and detach events, use the WacomMTRegisterAttachCallback and WacomMTRegisterDetachCallback to register your callback functions.

The attach callback function is the best choice for receiving notification of change in device information. Not only does this function provide a capabilities structure for all the currently attached devices, but it will also monitor the system and inform the application when a new device is added. Along with the detach callback this is the recommended method to keep track of the multi-touch devices. The detach callback function will monitor and inform the application when a device is detached. While strictly speaking an application could ignore a detach, it is recommended that the application close any open callbacks (data read callbacks are covered in the Data Read Functions section, below). This will allow the application to respond to the attach function more quickly.

Data Read Functions

There are three different types of data that may be provided by any given device. They are finger, blob, and raw.

  1. The finger data is the most common and the most useful. All multi-touch devices support at least finger data. This data tracks a touch point for each finger placed on the device. While a finger is down it is assigned an ID. The ID represents a point of contact, not a specific finger. For example, if the user places their index finger on the device, it may be assigned ID 1. When they place their middle finger down, it is assigned ID 2, and the ring finger that follows is assigned ID 3. If the user then lifts the middle finger, the lifetime of the contact labeled 2 ends. If that finger is returned to the device it may be given ID 2 again, because that ID is no longer in use as a contact, but it could also be given ID 4. If both the middle and ring fingers are removed and the ring finger alone is returned to the device, its ID will not be 3 as before, but may be 2 or perhaps 5.

  2. If the device supports blob data it will be indicated in the capabilities structure. Blob data represents an outline of the object or objects on the surface of the device. There are two types of blobs: the primary blob and void blobs. A blob is made from one primary and zero or more voids. The best way to think of a void is to imagine a doughnut. The outline of the outside of the doughnut is the primary blob and the hole is the void blob. Blobs are used when you want to perform an action on an irregular shape like a mask or smudge.

  3. If the device supports raw data it will be indicated in the capabilities structure. Raw data consists of the raw values from each point on the device.
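
Because an ID names a contact rather than a finger (point 1 above), an application should key its per-touch state by ID for exactly the down-to-up lifetime of that contact. A minimal sketch, with a simplified finger record and state enum standing in for the API's real finger data structures:

```c
#define MAX_CONTACTS 10

/* Simplified stand-in for the per-finger packet delivered to a finger
   callback; the real structure also carries size, pressure, etc. */
typedef enum { TouchDown, TouchHold, TouchUp } TouchState;
typedef struct { int id; TouchState state; float x, y; } Finger;

/* Per-contact application state, keyed by contact ID. */
static int gActiveIDs[MAX_CONTACTS];
static int gActiveCount = 0;

static int FindContact(int id)
{
   for (int i = 0; i < gActiveCount; ++i)
      if (gActiveIDs[i] == id) return i;
   return -1;
}

/* Called with each finger packet; begins a stroke on TouchDown and ends
   it on TouchUp.  After TouchUp the same ID may be reused for a brand
   new contact, so no state may outlive the up event. */
void ProcessFinger(const Finger *f)
{
   int slot = FindContact(f->id);
   if (f->state == TouchDown && slot < 0 && gActiveCount < MAX_CONTACTS)
      gActiveIDs[gActiveCount++] = f->id;              /* new contact begins */
   else if (f->state == TouchUp && slot >= 0)
      gActiveIDs[slot] = gActiveIDs[--gActiveCount];   /* contact ends */
   /* TouchHold: update the stroke with f->x, f->y here. */
}

int ActiveContactCount(void) { return gActiveCount; }
```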

There are two primary ways that an application can be notified of touch data availability: by callback, or by windowID.

The data callback function sets up a callback for a specific device. The callback is defined by the device for which it is created and by the hit rectangle provided for that callback. This means that an application can create regions on a device. Most of the time an application will not need to segment the device, which is why the default (NULL) region is the entire touch device. The units for the hit rectangle vary depending on the device. For an opaque device like an Intuos or Bamboo tablet, the units are normalized (i.e., the origin of the device is 0,0 and the width and height are both 1). For an integrated display device like a Cintiq Pro 24, the hit rectangle is in pixels. The device's location in the virtual monitor space is known by the driver. These values will change when the monitor location or size is changed, so your application should monitor the system device change message.

For Mac, the API has extra data collection functions. These functions take in a windowID and instead of activating a callback, the data is sent to the application on the message thread of the provided window. The provided window is also used to create the hit rectangle for integrated devices. Special care should be used with these functions as improper data handling will interfere with regular message handling.

When the data function is created the application indicates what it is planning to do with the data by specifying what processing mode should be used (see WacomMTProcessingMode). There are three processing modes that an application can use:

  1. If the application is registered in observer mode (see WMTProcessingModeObserver in WacomMTProcessingMode), the data is sent both to the application and the system. This means that if the application performs actions based on the data the driver/OS does not know and will also act upon the data. The callbacks can also be created in consumer mode. An observer will get all data not captured by a consumer even when the application is in the background.

  2. If an application is registered in consumer mode (see WMTProcessingModeNone in WacomMTProcessingMode), the data is sent only to that application and to any observers whose hit rectangle intersects the consumer's. The first consumer for that data is the only consumer to receive it; the OS does not get the data. Simply put, for a consumer to get data, the application must be the front-most application (i.e., have keyboard focus).

  3. Finally, if an application has registered a hitrect in passthrough mode (see WMTProcessingModePassThrough in WacomMTProcessingMode), all data going to that hitrect is passed through to the OS. This is useful for carving out a section of your multi-touch application that can still respond to user touch input for program control (such as button presses, or an application tool palette). Note that passthrough mode can co-exist with either the consumer or observer modes described above. Touch data going to a passthrough hitrect region will – in addition to being processed by the OS – be processed by an observer, but not processed by a consumer. Thus if your application is primarily a consumer, and you register a small dialog as a passthrough hitrect, you can interact with and move that dialog around without sending data to the consumer below. If an observer needs to stop getting data it should unregister its callback functions or unregister its windowID.
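
The routing rules above can be summarized in a small decision function. This is only a model of the behavior this section describes, not driver code; the three cases correspond to the passthrough, consumer, and observer modes of WacomMTProcessingMode:

```c
/* Who receives a given touch packet under the rules described above. */
enum { ROUTE_OS = 1, ROUTE_OBSERVERS = 2, ROUTE_CONSUMER = 4 };

/* hitPassThrough:  the touch falls inside a passthrough hitrect.
   consumerPresent: an application registered a consumer hitrect there.
   consumerFront:   that consumer application is front-most.           */
int RouteTouch(int hitPassThrough, int consumerPresent, int consumerFront)
{
   if (hitPassThrough)
      return ROUTE_OS | ROUTE_OBSERVERS;        /* OS and observers, never a consumer */
   if (consumerPresent && consumerFront)
      return ROUTE_CONSUMER | ROUTE_OBSERVERS;  /* consumer captures; OS gets nothing */
   return ROUTE_OS | ROUTE_OBSERVERS;           /* observers watch OS-bound data */
}
```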

Application Design Considerations

The Wacom Feel™ Multi-Touch API is a cross-OS platform data format that works with Mac and Windows. It is designed to work with all supported touch-enabled Wacom tablets. When designing software applications using the Wacom Feel™ Multi-Touch API you can be comfortable that your features should be consistent and compatible with future products from Wacom. Here are some design considerations:

  • Help your user keep their creative focus on their work. Seek to maximize fluidity of gestures for pan, rotate, zoom in-out, etc. within the user workspace.

  • Maximize the touchability of the user workspace. Use larger and well-spaced menus, controls, sliders, etc.

  • Enable the non-dominant hand to control menus while the dominant hand uses the pen. Allow menus to float or sit near the non-dominant hand for simple two-handed operations.

  • Bring controls to the pen and touch location, rather than menu strips.

Integrating Pen and Touch

One of the key benefits of using the Feel™ Multi-Touch API is fluid interaction with both pen and touch. Our recommendation is that the touch gestures you implement (multi-finger commands such as zoom, pan, etc.) should not be interrupted when the pen is merely in range of the tablet. This allows for a more natural workflow, as the user does not have to consciously raise the pen out of range before manipulating the application. Thus, arbitrating between pen and touch works best if you use pen pressure rather than the in-range status of the pen. For example, continue handling touch input while pen pressure is zero. Once the pen touches the tablet, pressure becomes greater than zero; as long as the pen is hovering (zero pressure), the app can continue processing touch input.

When using simultaneous pen and touch it is best to pay attention to the confidence bit. This is particularly useful for ignoring data that comes from the hand holding the pen.
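
The pressure-based arbitration and the confidence bit can be combined into a tiny gate. This is an application-side sketch; the pen pressure comes from a pen API rather than this Multi-Touch API, and the parameter names here are illustrative:

```c
/* Returns nonzero when a touch packet should be processed.  A hovering,
   in-range pen reports zero pressure and so does not block touch; a pen
   on the surface (pressure > 0) suspends touch handling.  Touches the
   device flags as non-confident (e.g. the heel of the hand holding the
   pen) are ignored outright. */
int ShouldProcessTouch(float penPressure, int touchConfidence)
{
   if (penPressure > 0.0f) return 0;  /* pen is down: let the pen win */
   if (!touchConfidence)   return 0;  /* likely the hand holding the pen */
   return 1;
}
```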

Display vs. Opaque Tablet Touch Behavior Considerations

The most important thing to consider when designing touch interactions for display tablets (Cintiq, etc.) and opaque tablets (Intuos, etc.) is that display tablets are direct touch and opaque tablets are indirect touch.

Indirect touch tablets operate like track pads. Touching an indirect device may move the cursor, but does not automatically generate a click at that location. As you move your finger, the cursor moves relative to its last location, and you can reach any part of the desktop area even when multiple monitors are attached. In this model, a gesture affects the object that is selected or has focus. Because the tablet is relative and not mapped to any specific monitor, the finger data is output in the tablet's natural units: normalized percentages of the tablet surface.

Direct touch tablets move the cursor on a single monitor. There is a one-to-one relationship between where the user touches on the tablet surface and where the touch input is mapped to in their software environment. As soon as the user touches the tablet surface a click is generated at that location. Here a gesture affects the object under the fingers. Finger data from a direct touch device will be output to the monitor space that the tablet is mapped to.

The Feel™ Multi-Touch API provides information about the tablet providing touch data. An application should always check for and handle data from both direct tablets and indirect touch tablets. Using the information provided with the device capabilities, an application can translate the finger position data into the best units for the individual application.
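
For example, converting a finger position into application client coordinates differs between the two tablet types. A sketch, assuming the application already knows its client rectangle in desktop pixels (the type and function names here are illustrative, not part of the API):

```c
typedef struct { float x, y; } Point2D;
typedef struct { float originX, originY, width, height; } Rect2D;

/* Indirect (opaque) tablet: finger data arrives normalized to 0..1, so
   scale it into the client rectangle directly (a relative mapping). */
Point2D OpaqueToClient(float normX, float normY, Rect2D client)
{
   Point2D p = { client.originX + normX * client.width,
                 client.originY + normY * client.height };
   return p;
}

/* Direct (display) tablet: finger data arrives in the pixel space of the
   mapped monitor, so only the client origin needs to be subtracted. */
Point2D DisplayToClient(float pixelX, float pixelY, Rect2D client)
{
   Point2D p = { pixelX - client.originX, pixelY - client.originY };
   return p;
}
```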

Sample Code

There is macOS Multi-Touch sample code available. This application shows how to get finger point data from a device and draw those touch points in the client area. The examples also show how to set up a device for receiving touch callbacks or to track a client through its WindowID. Included in the sample code is a demonstration of querying the list of attached touch tablet devices and the touch properties of each tablet. Please see the sample code comments for detailed information on API usage.

See Also

Overview – An introduction to the Multi-Touch API

Reference – Complete API details

FAQs – Multi-Touch programming tips

Where to get help

If you have programming questions about the Multi-Touch API, please visit our Support page at: