
Jim O'Neil




Open Database Connectivity

The possibilities are limitless

With that, the external function declaration for use within PowerBuilder is:

function integer SQLDataSources(ulong henv, uint fDirection,
     REF string szDSN, int cbDSNMax, REF int pcbDSN,
     REF string szDescription, int cbDescriptionMax, REF int pcbDescription)
ALIAS FOR "SQLDataSourcesA" LIBRARY "ODBC32.DLL"

Note that we have provided an ALIAS clause because there are both ANSI and Unicode versions of API calls that involve string arguments. PowerBuilder 9 is not Unicode compliant, so we use the ANSI versions of the methods in this example.

Within n_tr we have defined a method called GetDSNList to invoke SQLDataSources and collect the DSNs and driver names in a DataStore object that is returned from the method. The w_main window invokes this method when it first opens to display the data within a tree view. The core of this processing is shown in Listing 1.
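Listing 1 contains the complete implementation; the following is only a condensed sketch of the enumeration pattern, with illustrative variable and DataStore names (il_henv stands for the environment handle, and the constant values are those defined in the ODBC headers):

```powerscript
// Illustrative sketch of a SQLDataSources enumeration loop (not Listing 1 itself)
CONSTANT uint SQL_FETCH_FIRST = 2
CONSTANT uint SQL_FETCH_NEXT  = 1
CONSTANT integer SQL_SUCCESS  = 0

string  ls_dsn, ls_desc
integer li_rc, li_dsnLen, li_descLen
long    ll_row

ls_dsn  = Space(256)   // buffers must be allocated before being passed by reference
ls_desc = Space(256)

li_rc = SQLDataSources(il_henv, SQL_FETCH_FIRST, ls_dsn, 256, li_dsnLen, ls_desc, 256, li_descLen)
DO WHILE li_rc = SQL_SUCCESS
   ll_row = lds_dsnList.InsertRow(0)
   lds_dsnList.SetItem(ll_row, "dsn", ls_dsn)
   lds_dsnList.SetItem(ll_row, "driver", ls_desc)
   li_rc = SQLDataSources(il_henv, SQL_FETCH_NEXT, ls_dsn, 256, li_dsnLen, ls_desc, 256, li_descLen)
LOOP
```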

In this and the other methods we develop, a helper method HandleAPIError retrieves information on any errors that might occur while executing an ODBC API call. The SQLSTATE, error message, and error code are obtained via another ODBC API method, SQLError. You can view the implementation of that method via the source code provided with this example. 
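As a rough idea of its shape, a helper like HandleAPIError could be built on SQLError along these lines; the declaration and names here are assumptions, not the sample's exact source:

```powerscript
// Hypothetical sketch of an error helper built on SQLError, whose external
// declaration would look something like:
// function integer SQLError(ulong henv, ulong hdbc, ulong hstmt,
//     REF string szSqlState, REF long pfNativeError,
//     REF string szErrorMsg, integer cbErrorMsgMax, REF integer pcbErrorMsg)
// ALIAS FOR "SQLErrorA" LIBRARY "ODBC32.DLL"

string  ls_state, ls_msg
long    ll_native
integer li_rc, li_msgLen

ls_state = Space(6)      // SQLSTATE is five characters plus a terminator
ls_msg   = Space(512)
li_rc = SQLError(il_henv, il_hdbc, al_hstmt, ls_state, ll_native, ls_msg, 512, li_msgLen)
IF li_rc = 0 THEN
   MessageBox("ODBC Error " + ls_state, ls_msg)
END IF
```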

Step 2: Populating the Table List
When the user of the sample application double-clicks on one of the ODBC DSNs listed in the left tree view, a connection is established with that data source through the familiar PowerScript CONNECT USING SQLCA; statement, with the selected label of the tree view providing the DSN portion of the ConnectString DBParm. If other pieces of information are required to connect, like a user ID or password, the driver-specific dialog will appear prompting the user for that information.
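In PowerScript terms, that connection amounts to something like the following, where ls_dsn holds the label selected in the tree view (error handling is kept minimal for brevity):

```powerscript
// Illustrative connection using the DSN selected in the tree view
SQLCA.DBMS   = "ODBC"
SQLCA.DBParm = "ConnectString='DSN=" + ls_dsn + "'"
CONNECT USING SQLCA;
IF SQLCA.SQLCode <> 0 THEN
   MessageBox("Connect failed", SQLCA.SQLErrText)
END IF
```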

When PowerBuilder itself connects to an ODBC data source, it establishes both an environment and a connection handle. The rest of our application will leverage that connection handle, which is exposed to the PowerBuilder programmer via the DBHandle() method of the transaction object. This eliminates the need for our application to explicitly invoke the SQLAllocConnect and SQLFreeConnect methods.
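Capturing that handle is a one-liner; the variable name is illustrative:

```powerscript
// After a successful CONNECT, reuse the connection handle PowerBuilder already holds
long ll_hdbc
ll_hdbc = SQLCA.DBHandle()   // for the ODBC interface, this is the hdbc
// ll_hdbc can now be passed wherever the API declarations expect a connection handle
```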

Upon viewing the sample code, you'll notice that the n_tr object includes a private method, GetDataSourceInfo, which issues a number of SQLGetInfo calls to query and store driver-specific properties that affect our application. In our particular case we need to obtain the maximum lengths of catalog names, table names, schemas (users), and column names for later use in invoking other ODBC methods. These items are just a small subset of the information that PowerBuilder gathers upon connecting to a database. You can see their values exposed in the Driver Information keys within the PowerBuilder DataBase painter as shown in Figure 3.
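A sketch of what those SQLGetInfo calls might look like follows; the SQLGetInfoInt declaration is an assumed alias for numeric info values (the information type codes 30 and 35 are the standard ODBC values), and the variable names are illustrative:

```powerscript
// Assumed declaration for numeric info values:
// function integer SQLGetInfoInt(ulong hdbc, uint fInfoType,
//     REF uint rgbInfoValue, integer cbInfoValueMax, REF integer pcbInfoValue)
// ALIAS FOR "SQLGetInfoA" LIBRARY "ODBC32.DLL"

CONSTANT uint SQL_MAX_COLUMN_NAME_LEN = 30
CONSTANT uint SQL_MAX_TABLE_NAME_LEN  = 35

uint    lui_maxColumn, lui_maxTable
integer li_rc, li_outLen

li_rc = SQLGetInfoInt(il_hdbc, SQL_MAX_COLUMN_NAME_LEN, lui_maxColumn, 2, li_outLen)
li_rc = SQLGetInfoInt(il_hdbc, SQL_MAX_TABLE_NAME_LEN,  lui_maxTable,  2, li_outLen)
```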

To populate the table list in the right tree view, we call the SQLTables API on a new statement handle and provide null values for the catalog, schema, table, and table type parameters. The nulls are sentinel values indicating that all table objects should be returned. Had we been interested only in a specific subset of tables, or even just one specific table, we would have provided string arguments, possibly including wildcard characters, for the particular facets desired. In fact, this is precisely what PowerBuilder does when you specify a TableCriteria within the System tab of the Database Connection profile. In the API declaration below, the Table Name and Table Owner entries of the Table Criteria are passed into szTableName and szSchemaName arguments, while the Table Type checkboxes are coalesced into the szTableType string:

function integer SQLTables (ulong hstmt,
    string szCatalogName, integer cbCatalogName,
    string szSchemaName, integer cbSchemaName,
    string szTableName, integer cbTableName,
    string szTableType, integer cbTableType)
ALIAS FOR "SQLTablesA" LIBRARY "ODBC32.DLL"
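For our "return everything" case, the call might be sketched as below. SQLAllocStmt (assumed to be declared with a REF ulong statement-handle argument) obtains the new statement handle on PowerBuilder's connection, and passing null strings with zero lengths is what requests all tables:

```powerscript
// Hypothetical sketch; il_hdbc is the handle obtained from DBHandle()
ulong   ll_hstmt
string  ls_null
integer li_rc

li_rc = SQLAllocStmt(il_hdbc, ll_hstmt)   // allocate a statement handle
SetNull(ls_null)                          // a null string argument passes a NULL pointer
li_rc = SQLTables(ll_hstmt, ls_null, 0, ls_null, 0, ls_null, 0, ls_null, 0)
```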

SQLTables returns a result set similar to what you would expect after issuing a SELECT statement in PowerBuilder. To retrieve the rows of that result set, we need to introduce two additional ODBC API calls:

  • SQLBindCol: Associates one column of a result set with a specific memory location in the client application
  • SQLFetch: Retrieves one row of the result set into the variables set up with SQLBindCol
SQLBindCol is invoked for each column of the result set whose data we want to examine. In this specific case, the documentation for SQLTables reveals that we need the first four columns of the result set, namely the catalog name, schema name, table name, and table type. Since each of these is a string value, we must first allocate space for each variable using the PowerScript Space function, passing it the maximum length for the specific string as determined from a previous invocation of SQLGetInfo.
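Binding those four columns might be sketched as follows; SQL_C_CHAR has the ODBC value 1, the maximum lengths are assumed to have been gathered earlier via SQLGetInfo, and all variable names are illustrative:

```powerscript
CONSTANT int SQL_C_CHAR = 1

string ls_catalog, ls_schema, ls_table, ls_type
long   ll_len1, ll_len2, ll_len3, ll_len4

ls_catalog = Space(lui_maxCatalog)   // buffers sized from the SQLGetInfo results
ls_schema  = Space(lui_maxSchema)
ls_table   = Space(lui_maxTable)
ls_type    = Space(32)

SQLBindColString(ll_hstmt, 1, SQL_C_CHAR, ls_catalog, lui_maxCatalog + 1, ll_len1)
SQLBindColString(ll_hstmt, 2, SQL_C_CHAR, ls_schema,  lui_maxSchema + 1,  ll_len2)
SQLBindColString(ll_hstmt, 3, SQL_C_CHAR, ls_table,   lui_maxTable + 1,   ll_len3)
SQLBindColString(ll_hstmt, 4, SQL_C_CHAR, ls_type,    33,                 ll_len4)
```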

After the columns of the result set have been bound to specific variables in our client application, each execution of SQLFetch will populate these variables with the next row of the result set (which in our case is the next table object in the connected data source). Typically, a SQLFetch loop is written and terminates when the end of the result set has been reached (as noted by a return value of SQL_NO_DATA, with a value of 100) or an error has been detected. A sample of such an implementation is shown in Listing 2, where we populate another DataStore with the catalog, schema, table name, and table type of every table object within the connected data source.
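In the spirit of Listing 2, such a loop might be condensed as follows, assuming the four bound string variables are named ls_catalog, ls_schema, ls_table, and ls_type (illustrative names, as is the HandleAPIError signature):

```powerscript
// Fetch each row into the bound variables until the result set is exhausted
CONSTANT integer SQL_NO_DATA = 100
integer li_rc
long    ll_row

li_rc = SQLFetch(ll_hstmt)
DO WHILE li_rc = 0                       // SQL_SUCCESS
   ll_row = ldsTableList.InsertRow(0)
   ldsTableList.SetItem(ll_row, "catalog",    ls_catalog)
   ldsTableList.SetItem(ll_row, "schema",     ls_schema)
   ldsTableList.SetItem(ll_row, "table_name", ls_table)
   ldsTableList.SetItem(ll_row, "table_type", ls_type)
   li_rc = SQLFetch(ll_hstmt)
LOOP
IF li_rc <> SQL_NO_DATA THEN HandleAPIError(ll_hstmt)
```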

The DataStore ldsTableList populated in Listing 2 is returned to the main window whenever a new connection to a DSN in the left tree view is established. Our implementation of the PopulateTableTree method in w_main sorts this DataStore by catalog, table type, schema, and table name and presents the information hierarchically in the right tree view.

Step 3: Populating the Column List
The last portion of our application provides the list of columns within the table selected from the tree view populated in Step 2. The overall logic here is very similar to obtaining all of the tables, but instead we use the SQLColumns method, and we must bind a number of columns to both string and numeric variables to obtain the information necessary to display the data types and lengths of the various columns.
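The external function declaration for SQLColumns parallels that of SQLTables, with a column name criterion replacing the table type; a sketch, assuming the ANSI entry point as before:

```powerscript
function integer SQLColumns (ulong hstmt,
    string szCatalogName, integer cbCatalogName,
    string szSchemaName, integer cbSchemaName,
    string szTableName, integer cbTableName,
    string szColumnName, integer cbColumnName)
ALIAS FOR "SQLColumnsA" LIBRARY "ODBC32.DLL"
```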

The documentation for the SQLColumns API shows 18 individual pieces (columns) of information returned for each column within a given table, six of which were added with the ODBC 3.0 specification. Of those values, we are interested only in the six columns shown in Table 1.

Our display includes only the column name, the vendor-specific data type name, and the declared length of the column; determining these, however, requires logic that depends on both the DATA_TYPE and NUM_PREC_RADIX columns of the result set. That logic is implemented in the w_main method PopulateColumnList, based on the description of the SQLColumns API in the Microsoft ODBC SDK documentation.

One of the most notable shortcomings of our implementation here is that the display may not actually reflect the column ordering within the underlying table. Although each column's ordinal position is provided in column 17 of the SQLColumns result set, that information is guaranteed to be available only from ODBC 3.0-compliant drivers, and we have set a constraint that we use only ODBC 2.0 constructs. As a result, our application reports only the columns in the order returned by SQLColumns; however, in a more robust application, we would conditionally determine whether the current data source supports that ordinal position column and adjust our implementation to use the correct ordering whenever available (see Listing 3).

You might also notice in the declaration of Local External Functions in n_tr that we have added additional SQLBindCol variants: SQLBindColInteger and SQLBindColLong. Like the SQLBindColString we used before, these methods are all aliases for the single method SQLBindCol. The overloaded variants are required in PowerBuilder due to the strong typing and lack of pointer variables. In the ODBC API declarations, the TargetValue argument in SQLBindCol has a type of SQLPOINTER, which is simply a synonym for a void pointer that can be cast to the appropriate C data type when needed. PowerBuilder, however, does not support casting, so we must specify whether that argument is a string, integer, or other type. These external function declarations therefore differ only by the type of that third argument, and we use the one that is appropriate for the specific data type of the column to be bound.
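The three variants might be declared along these lines; SQLBindCol has no separate ANSI and Unicode entry points, so all three alias the same export (the exact declarations may differ slightly in the sample source):

```powerscript
function integer SQLBindColString (ulong hstmt, uint icol, int fCType,
    REF string rgbValue, long cbValueMax, REF long pcbValue)
ALIAS FOR "SQLBindCol" LIBRARY "ODBC32.DLL"

function integer SQLBindColInteger (ulong hstmt, uint icol, int fCType,
    REF int rgbValue, long cbValueMax, REF long pcbValue)
ALIAS FOR "SQLBindCol" LIBRARY "ODBC32.DLL"

function integer SQLBindColLong (ulong hstmt, uint icol, int fCType,
    REF long rgbValue, long cbValueMax, REF long pcbValue)
ALIAS FOR "SQLBindCol" LIBRARY "ODBC32.DLL"
```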

As in the other methods we've discussed, the DataStore populated by this code is passed back to the window for interpretation and display within a list view. In this case, we have elected to hide length information for numeric types that are not decimals, and we format the length of decimals as precision.scale using the various columns of the SQLColumns result set to determine the appropriate display values.

Conclusion
There you have it: a PowerBuilder client application that provides many of the same features as the database tools built into PowerBuilder itself. It's not difficult to see how this example could be extended further to allow the dynamic creation of SQL statements, much as PowerBuilder does in its Database and DataWindow painters. The functionality of your PowerBuilder ODBC application is limited only by your willingness to probe and experiment with the ODBC API.

More Stories By Jim O'Neil

Jim is a Technology Evangelist for Microsoft who covers the Northeast District, namely New England and upstate New York. He focuses on engaging with the development community in the area through user groups, code camps, BarCamps, and Microsoft-sponsored events, and in general serves as an ambassador for Microsoft. Since 2009, Jim has been focusing on software development scenarios using cloud computing and Windows Azure. You can follow Jim on Twitter at @jimoneil
