Pyodbc meta-data methods you should be using – interactions with MySQL.

In my day job, I have recently begun to leverage the Python pyodbc package for MS Access-related database tasks. When working with any database, it goes without saying that understanding the schema is paramount. What tables are present? What are their columns and types? How are they related? Among the many methods pyodbc provides to answer these kinds of questions, there are 3 ‘meta-data’ methods you simply cannot live without. Want to know which ones they are? Keep reading…


OS, Software, and DB used:

  • OpenSuse Leap 15.1
  • MySQL 8.0.18
  • pyodbc 4.0.27


Self-Promotion:

If you enjoy the content written here, by all means, share this blog and your favorite post(s) with others who may benefit from or like it as well. Since coffee is my favorite drink, you can even buy me one if you would like!


I’m quite fond of – and comfortable with – working in a command-line environment, so the examples presented in this post are carried out using ipython, which can easily be installed via pip.

Connection handlers…

It only makes sense that to use pyodbc, it must first be imported. Also, assume for the duration of this post that I am working with the database handles below – a.k.a. the connection and cursor – and targeting the open-source sakila database:

In [1]: import pyodbc                                                                                                                                        

In [2]: conn = pyodbc.connect("DRIVER={MySQL ODBC 8.0 Unicode Driver};SERVER=localhost;DATABASE=sakila;USER=j2112o;PASSWORD=my_password_here;")

In [3]: cur = conn.cursor()                                                                                                                                  
In [4]: cur.execute("USE sakila;")

(Handy Tip: Use help(pyodbc) to explore a ton of information about pyodbc classes and objects.)
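One small habit worth adopting: build the connection string from named parts so a typo in a keyword (such as DATABASE) jumps out. A minimal sketch, using the same placeholder credentials as above:

import pyodbc

# Placeholder values – substitute your own driver name and credentials.
conn_parts = {
    "DRIVER": "{MySQL ODBC 8.0 Unicode Driver}",
    "SERVER": "localhost",
    "DATABASE": "sakila",
    "USER": "j2112o",
    "PASSWORD": "my_password_here",
}
conn_str = ";".join(f"{k}={v}" for k, v in conn_parts.items())
conn = pyodbc.connect(conn_str)
cur = conn.cursor()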

#1: Database Meta-data…

Knowing what tables are present – and available – for the currently logged-in user is downright important. Here’s how you see those tables using pyodbc:

In [5]: show_tables = cur.tables()                                                                                                                            

In [6]: print(type(show_tables))                                                                                                                              
<class 'pyodbc.Cursor'>

Calling the tables() method on the ‘cur’ cursor and storing the result in the variable ‘show_tables’ returns a cursor object – in fact, the same cursor, per the ‘--> self’ in the signature. Here is a portion of the tables() method section from the documentation:

tables(...)
     |      C.tables(table=None, catalog=None, schema=None, tableType=None) --> self
     |      
     |      Executes SQLTables and creates a results set of tables defined in the data
     |      source.
     |      
     |      The table, catalog, and schema interpret the '_' and '%' characters as
     |      wildcards.  The escape character is driver specific, so use
     |      `Connection.searchescape`.
     |      
     |      Each row fetched has the following columns:
     |       0) table_cat: The catalog name.
     |       1) table_schem: The schema name.
     |       2) table_name: The table name.
     |       3) table_type: One of 'TABLE', 'VIEW', 'SYSTEM TABLE', 'GLOBAL TEMPORARY',
     |          'LOCAL TEMPORARY', 'ALIAS', 'SYNONYM', or a data source-specific type name.

Iterating through a cursor object is quite simple using a for loop. Let’s see what data is bound to the ‘show_tables’ cursor:

In [7]: for row in show_tables:
   ...:     print(row)
   ...:                                                                                                                                                      
('sakila', '', 'actor', 'TABLE', '')
('sakila', '', 'actor_info', 'VIEW', 'VIEW')
('sakila', '', 'address', 'TABLE', '')
('sakila', '', 'category', 'TABLE', '')
('sakila', '', 'city', 'TABLE', '')
('sakila', '', 'country', 'TABLE', '')
('sakila', '', 'customer', 'TABLE', '')
('sakila', '', 'customer_list', 'VIEW', 'VIEW')
('sakila', '', 'film', 'TABLE', '')
('sakila', '', 'film_actor', 'TABLE', '')
('sakila', '', 'film_category', 'TABLE', '')
('sakila', '', 'film_list', 'VIEW', 'VIEW')
('sakila', '', 'film_text', 'TABLE', '')
('sakila', '', 'inventory', 'TABLE', '')
('sakila', '', 'language', 'TABLE', '')
('sakila', '', 'nicer_but_slower_film_list', 'VIEW', 'VIEW')
('sakila', '', 'payment', 'TABLE', '')
('sakila', '', 'rental', 'TABLE', '')
('sakila', '', 'sales_by_film_category', 'VIEW', 'VIEW')
('sakila', '', 'sales_by_store', 'VIEW', 'VIEW')
('sakila', '', 'staff', 'TABLE', '')
('sakila', '', 'staff_list', 'VIEW', 'VIEW')
('sakila', '', 'store', 'TABLE', '')

As noted in the tables() documentation, each returned ‘row’ contains at the very least: table_cat, table_schem, table_name, and table_type.
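The excerpt above also notes that the table, catalog, and schema parameters interpret ‘_’ and ‘%’ as wildcards, and that the tableType parameter restricts the kind of object returned. A minimal sketch, assuming the same sakila connection, that narrows the results before iterating:

# Only the views (compare with the full listing above).
for row in cur.tables(tableType='VIEW'):
    print(row)

# Wildcard filtering: '%' matches any run of characters, so this
# returns film, film_actor, film_category, film_list, and film_text.
for row in cur.tables(table='film%'):
    print(row)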

Perhaps instead of all the returned values that tables() provides, you only need the table name. You can simply access that table_name value using dot (.) notation on the ‘row’ object:

In [14]: for row in show_tables:
    ...:     print(row.table_name)
    ...:                                                                                                                                                      
actor
actor_info
address
category
city
country
customer
customer_list
film
film_actor
film_category
film_list
film_text
inventory
language
nicer_but_slower_film_list
payment
rental
sales_by_film_category
sales_by_store
staff
staff_list
store

Which turns out to be the pyodbc equivalent of running this SQL in the MySQL command-line client:

mysql> SHOW TABLES;
+----------------------------+
| Tables_in_sakila           |
+----------------------------+
| actor                      |
| actor_info                 |
| address                    |
| category                   |
| city                       |
| country                    |
| customer                   |
| customer_list              |
| film                       |
| film_actor                 |
| film_category              |
| film_list                  |
| film_text                  |
| inventory                  |
| language                   |
| nicer_but_slower_film_list |
| payment                    |
| rental                     |
| sales_by_film_category     |
| sales_by_store             |
| staff                      |
| staff_list                 |
| store                      |
+----------------------------+
23 rows in set (0.01 sec)
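If you would rather have that same listing as a plain Python list, a quick sketch over a fresh tables() call does the trick:

# Collect every table and view name into a list for later use.
table_names = [row.table_name for row in cur.tables()]
print(len(table_names))   # 23, matching the MySQL client output above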

#2: Table Descriptions…

I run this type of command so often that one would think I would have the structure of most of the tables I work with memorized:

mysql> DESC store;
+------------------+----------------------+------+-----+-------------------+-----------------------------------------------+
| Field            | Type                 | Null | Key | Default           | Extra                                         |
+------------------+----------------------+------+-----+-------------------+-----------------------------------------------+
| store_id         | tinyint(3) unsigned  | NO   | PRI | NULL              | auto_increment                                |
| manager_staff_id | tinyint(3) unsigned  | NO   | UNI | NULL              |                                               |
| address_id       | smallint(5) unsigned | NO   | MUL | NULL              |                                               |
| last_update      | timestamp            | NO   |     | CURRENT_TIMESTAMP | DEFAULT_GENERATED on update CURRENT_TIMESTAMP |
+------------------+----------------------+------+-----+-------------------+-----------------------------------------------+
4 rows in set (0.01 sec)

But, I don’t.
And, that’s okay. Thankfully, when working with pyodbc, you can retrieve this same information – or just the specific pieces of that meta-data you require.

Using the columns() method, you can get a table’s description all the same. Here is the output for the ‘store’ table description with pyodbc:

In [23]: store_meta = cur.columns(table='store')                                                                                                              

In [24]: for row in store_meta:
    ...:     print(row)
    ...:                                                                                                                                                      
('', None, 'store', 'store_id', -6, 'tinyint unsigned', 3, 1, 0, 10, 1, '', '0', -6, None, None, 1, 'YES')
('', None, 'store', 'manager_staff_id', -6, 'tinyint unsigned', 3, 1, 0, 10, 0, '', '0', -6, None, None, 2, 'NO')
('', None, 'store', 'address_id', 5, 'smallint unsigned', 5, 2, 0, 10, 0, '', '0', 5, None, None, 3, 'NO')
('', None, 'store', 'last_update', 93, 'timestamp', 19, 16, 0, 10, 1, '', None, 9, 93, None, 4, 'YES')

And what do all those values mean? Lucky for us, the documentation is there:

columns(...)
     |      C.columns(table=None, catalog=None, schema=None, column=None)
     |      
     |      Creates a results set of column names in specified tables by executing the ODBC SQLColumns function.
     |      Each row fetched has the following columns:
     |        0) table_cat
     |        1) table_schem
     |        2) table_name
     |        3) column_name
     |        4) data_type
     |        5) type_name
     |        6) column_size
     |        7) buffer_length
     |        8) decimal_digits
     |        9) num_prec_radix
     |       10) nullable
     |       11) remarks
     |       12) column_def
     |       13) sql_data_type
     |       14) sql_datetime_sub
     |       15) char_octet_length
     |       16) ordinal_position
     |       17) is_nullable

You can even fine-tune just the values you want and filter out all the others. Similar to the tables() method, you can access the columns through dot (.) notation. Let’s look at just the column_name and type_name:

In [29]: store_meta = cur.columns(table='store')                                                                                                              

In [30]: for row in store_meta:
    ...:     print(row.column_name, row.type_name)
    ...:                                                                                                                                                      
store_id tinyint unsigned
manager_staff_id tinyint unsigned
address_id smallint unsigned
last_update timestamp
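From here it is a short hop to a reusable structure. A minimal sketch, assuming the same connection, that re-runs columns() (the cursor was exhausted by the loop above) and builds a {column_name: type_name} dictionary for the ‘store’ table:

# Map each column of 'store' to its type name.
store_cols = {row.column_name: row.type_name
              for row in cur.columns(table='store')}
print(store_cols)
# {'store_id': 'tinyint unsigned', 'manager_staff_id': 'tinyint unsigned',
#  'address_id': 'smallint unsigned', 'last_update': 'timestamp'}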

#3: Referential Integrity…

Guess what? I have been pulling your leg. This post does not cover 3 pyodbc ‘meta-data’ methods you need to know about. It covers 4!

We all know how important FOREIGN KEYS are in table relationships, right? If questions like “Do any foreign keys exist in the schema?” or “If so, what PRIMARY KEYS do they point to?” cross your mind, pyodbc can access this vital information all the same.

How, you might ask? By leveraging the foreignKeys() and primaryKeys() methods.

Just like other cursor objects, you can iterate through the results of a call to foreignKeys(). Targeting the ‘store’ table, we can see what other tables reference the ‘store_id’ PRIMARY KEY:

In [6]: store_fk_cur = cur.foreignKeys(table='store')

In [7]: for row in store_fk_cur:
   ...:     print(row)
   ...:                                                                                                                                                      
('sakila', None, 'store', 'store_id', 'sakila', None, 'customer', 'store_id', 1, 0, 3, 'fk_customer_store', 'PRIMARY', 7)
('sakila', None, 'store', 'store_id', 'sakila', None, 'inventory', 'store_id', 1, 0, 3, 'fk_inventory_store', 'PRIMARY', 7)
('sakila', None, 'store', 'store_id', 'sakila', None, 'staff', 'store_id', 1, 0, 3, 'fk_staff_store', 'PRIMARY', 7)

Once again, the documentation provides clarity on the various column meanings, as shown in this partial excerpt:

Each row fetched has the following columns:
     |        0) pktable_cat
     |        1) pktable_schem
     |        2) pktable_name
     |        3) pkcolumn_name
     |        4) fktable_cat
     |        5) fktable_schem
     |        6) fktable_name
     |        7) fkcolumn_name
     |        8) key_seq
     |        9) update_rule
     |       10) delete_rule
     |       11) fk_name
     |       12) pk_name
     |       13) deferrability

Using the same ‘store’ table with the primaryKeys() method returns a meaningful row of data as well:

In [10]: store_pk_cur = cur.primaryKeys(table='store')                                                                                                        

In [11]: for row in store_pk_cur:
    ...:     print(row)
    ...:                                                                                                                                                      
(None, None, 'store', 'store_id', 1, 'PRIMARY')

And here is a section from the primaryKeys() documentation, with the meaning of the returned column values:

primaryKeys(...)
     |      C.primaryKeys(table, catalog=None, schema=None) --> self
     |      
     |      Creates a results set of column names that make up the primary key for a table
     |      by executing the SQLPrimaryKeys function.
     |      Each row fetched has the following columns:
     |       0) table_cat
     |       1) table_schem
     |       2) table_name
     |       3) column_name
     |       4) key_seq
     |       5) pk_name


I recently wrote Using the REFERENTIAL_CONSTRAINTS table in MySQL – Foreign Key Awareness, which is along the same lines as this section of the post; you can read it for more information about this handy meta-data table present in MySQL.


There you have it. The 4 pyodbc ‘meta-data’ methods you cannot live without. Try using them to answer these types of ‘meta-data’ questions the next time you have to interact with a supported database using pyodbc.

I’d love to know your thoughts about the post. Like what you have read? See anything incorrect? Please comment freely below, and thanks for reading!

A Call To Action!

Thank you for taking the time to read this post. I truly hope you discovered something interesting and enlightening. Please share this post with someone else who would get the same value out of it.

Visit the Portfolio-Projects page to see blog post/technical writing I have completed for clients.

Have I mentioned how much I love a cup of coffee?!?!

To receive email notifications (Never Spam) from this blog (“Digital Owl’s Prose”) for the latest blog posts as they are published, please subscribe (of your own volition) by clicking the ‘Click To Subscribe!’ button in the sidebar on the homepage! (Feel free at any time to review the Digital Owl’s Prose Privacy Policy Page for any questions you may have about: email updates, opt-in, opt-out, contact forms, etc…)

Be sure and visit the “Best Of” page for a collection of my best blog posts.


Josh Otwell has a passion to study and grow as a SQL Developer and blogger. Other favorite activities find him with his nose buried in a good book, an article, or the Linux command line. He also enjoys tabletop RPG games, reading fantasy novels, and spending time with his wife and two daughters.

Disclaimer: The examples presented in this post are hypothetical ideas of how to achieve similar types of results. They are not the utmost best solution(s). The majority, if not all, of the examples provided are performed in a personal development/learning workstation environment and should not be considered production-quality or production-ready. Your particular goals and needs may vary. Use those practices that best benefit your needs and goals. Opinions are my own.
