Another place to look for DB2 performance gains is the use of compression.
Using DB2 Compression
Researching another aspect of one of the main tables showed that the table was not leveraging DB2 compression. Since it has almost a million rows, a long record length, and was probably referenced more for retrieval than for update or insert activity, it could benefit tremendously from compression. Compression implemented at other clients has produced disk space savings of 50% to 90% and reduced overall CPU, because each I/O retrieves and caches more rows per page. Researching compression further showed over 130 tables that might be candidates for DB2 compression.
To find your own DB2 compression candidates, use the catalog query below.
-- Provides a list of tables that are candidates for DB2 compression:
-- tables with row sizes > 75 that have more than 500000 rows
SELECT DBNAME, TSNAME, PARTITION, CARD,
       PQTY*4/720 AS CYLS, STATSTIME
  FROM SYSIBM.SYSTABLEPART
 WHERE COMPRESS = ' '
   AND CARD > 500000
   AND DBNAME NOT LIKE 'TMP%'
   AND DBNAME NOT LIKE 'WRK%'
 ORDER BY 4 DESC
 WITH UR;
In the list of tables created for this client, I discovered that a large portion of the client's environment was already compressed. (To include already-compressed tables in the list, drop the COMPRESS = ' ' predicate and add the COMPRESS column to the SELECT list; compressed tables show a "Y.") Revisit these tables to make sure they are still good candidates for DB2 compression to improve DB2 performance.
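One way to review how well existing compression is working is the PAGESAVE column in SYSIBM.SYSTABLEPART, which records the percentage of pages saved by compression as of the last RUNSTATS. A sketch of such a check (the 30% threshold is only illustrative):

```sql
-- Review already-compressed partitions; a low PAGESAVE suggests the
-- compression dictionary may be stale or the data compresses poorly.
SELECT DBNAME, TSNAME, PARTITION, CARD, PAGESAVE, STATSTIME
  FROM SYSIBM.SYSTABLEPART
 WHERE COMPRESS = 'Y'
   AND PAGESAVE < 30        -- illustrative threshold
 ORDER BY PAGESAVE ASC
 WITH UR;
```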
For example, in one situation new requirements were placed on a DB2 table and massive update activity arrived at a table that had previously been read-mostly, making compression far less attractive. Before compressing, make sure the table is used mostly for read activity; each individual DB2 table needs to be analyzed to determine its access profile (insert and update versus select). The candidate tables found at the client took over a terabyte of disk space. Reducing that by 50% saved considerable resources, both in CPU performance and in disaster recovery. It also cancelled planned disk drive purchases, saving the client even more money.
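One way to approximate a table's write activity is the real-time statistics table SYSIBM.SYSTABLESPACESTATS, which tracks insert, update, and delete counts since the last REORG. A sketch, assuming real-time statistics are active (the 100000 threshold is only illustrative):

```sql
-- Approximate write activity per table space since the last REORG.
-- Heavy insert/update/delete counts argue against compression.
SELECT DBNAME, NAME, PARTITION,
       REORGINSERTS, REORGUPDATES, REORGDELETES
  FROM SYSIBM.SYSTABLESPACESTATS
 WHERE REORGINSERTS + REORGUPDATES + REORGDELETES > 100000
 ORDER BY 4 DESC
 WITH UR;
```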
Summary
It's worthwhile to run the query shown earlier. When you have the list of tables that are candidates for compression, you will need to research their access profiles. If a table has significantly more SELECT activity than INSERT, UPDATE, or DELETE activity, it is a good candidate for compression, saving your company considerable processing resources and money while improving DB2 performance.
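Once a table checks out, compression is enabled at the table space level and takes effect when the compression dictionary is built, typically by the next REORG or LOAD REPLACE. A minimal sketch (the database and table space names are placeholders):

```sql
-- Enable compression for one table space (MYDB.MYTS is hypothetical).
ALTER TABLESPACE MYDB.MYTS COMPRESS YES;
-- Existing rows are compressed and the dictionary is built by the
-- next REORG (or LOAD REPLACE) of the table space.
```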
_________________________________________________________
Dave Beulke is an internationally recognized DB2 consultant, DB2 trainer, and education instructor. Dave helps his clients improve their strategic direction, dramatically improve DB2 performance, and reduce their CPU demand, saving millions in their systems, databases, and application areas within their mainframe, UNIX, and Windows environments.