In a world where hardware never seems to keep up with software, it is important to do everything you can to improve the performance of your application. This chapter will help you to optimize your applications for speed and reduce the memory and hard disk space required by your applications.
Optimization is the process of reviewing your operating environment, VBA code, application objects, and data structures to ensure that they provide optimum performance. In a nutshell, optimization is the process of making your application leaner and meaner.
Users become frustrated when an application runs slowly. In fact, if users are not warned about a slow process, they often reboot or shut down the power on the machine while a process is running. This can have dire results on the integrity of the data.
Many things can be done to optimize an application's performance, ranging from using a front-end tool such as the Performance Analyzer to fastidiously adhering to certain coding techniques. The sections that follow highlight the major things that you can do to optimize the performance of your applications.
The Access environment refers to the combination of hardware and software configurations under which Microsoft Access is running. These environmental settings can greatly affect the performance of an Access application.
The easiest way to improve the performance of an Access application is to upgrade the hardware and software configuration that it is running on. This form of optimization requires no direct intervention from the developer. A side benefit of most of the environmental changes you can make is that any improvements made to the environment will be beneficial to users in all of their Windows applications.
Improving the environment involves more than just adding some RAM. It can also mean optimally configuring the operating system and the Access application.
The bottom line is that Windows 95 and Access 95 both crave hardware—the more, the better. The faster your users' machines, and the more memory they have, the better. Additional hardware might not be the least expensive solution, but it certainly is the quickest and easiest thing that you can do to improve the performance of your application.
Memory is what Access craves most, whether you are running the full version of Microsoft Access or the runtime version of the product. Microsoft Access requires 12MB of RAM just to run under Windows 95, its standard operating environment. Although 12MB of RAM is required, Microsoft recommends 16MB. Under Windows NT, Access requires a minimum of 16MB of RAM. Both requirements can climb dramatically if your user is running other applications or if your application uses OLE automation to communicate with other applications. Put simply, the more RAM you and the users of your application have, the better. Having 32MB of RAM creates a great environment for Access 95. In fact, if every one of your users has at least 32MB of RAM, you can stop reading this chapter; everything else covered here provides only minor benefits compared to adding more RAM. If you are like most of us, and not every one of your users has a Pentium 120 with 32MB of RAM, read on.
As your computer writes information to disk, it attempts to find contiguous space on disk within which to place data files. As the hard disk fills up, files are placed in fragmented pieces on the hard disk. Each time that your application attempts to read data and programs, it must locate the information scattered over the disk. This is a very time-consuming process. Therefore, it is helpful to defragment the hard disk on which the application and data tables are stored, using a utility such as the Disk Defragmenter that ships with Windows 95.
Just as the operating system fragments your files over time, Access introduces its own form of fragmentation. Each time you add and modify data, your database grows. The problem is that when you delete data or objects within your database, the database does not shrink. Instead, Access leaves the empty pages in the file so that new data can be placed in them, but those pages are not necessarily refilled. You can free this excess space by using the Compact utility, which is part of the Microsoft Access software. The Compact utility frees excess space and attempts to make all data pages contiguous. You should compact your database frequently, especially if records or database objects (for example, forms and reports) are regularly added and deleted. The Compact utility can be run only when no database is open. To access it, select Tools|Database Utilities|Compact Database.
Regardless of the compression utility that you are using, disk compression significantly degrades performance with Access 95. This fact is documented in the README file.
Although Windows 95 attempts to manage virtual memory on its own, you might find it useful to provide Windows 95 with some additional advice. To modify the size of the swap file, right-click My Computer, select Properties, choose the Performance tab, and then click the Virtual Memory button. It might be useful to change the size of the swap file, or to move it to a faster disk drive or to a drive connected to a separate controller card. Bear in mind that any change you make can just as easily hurt performance as help it, so evaluate each change carefully to confirm that it actually improves the situation.
As covered in Chapter 18, "Developing for a Multiuser Environment," it is best to install both the Access software and your application objects on each user's local machine. Only the data tables should be stored on a network file server. Otherwise, you will be sending DLLs, OLE Objects, help files, type libraries, executables, and database objects all over the network wire. If you want to get the worst possible performance out of an Access application, install it on a diskless workstation with 12MB of RAM!
It always amuses me that the users with the slowest machines and the least memory have the most accessories running. These accessories include multimedia, fancy wallpaper, and other nifty utilities. If performance is a problem, you might try experimenting to see whether eliminating some of the frivolous niceties improves the performance of your application. If it does, encourage the user to eliminate the frills, get more memory, or accept your application's performance.
Another tip to make Windows 95 run faster is to shut down and restart on a regular basis. Memory tends to become fragmented and applications run more slowly. Although I can go weeks or months in Windows NT without rebooting, I find it beneficial to reboot my Windows 95 machine a couple of times a day.
In addition to the more obvious measures just outlined, some minor software tweaking can go a long way toward improving performance. Adjusting several settings in the Windows registry can dramatically improve performance. These changes all involve the registry's ISAM section. The properties that you might want to change include MaxBufferSize and ReadAheadPages. Both of these settings determine how the Jet engine utilizes memory.
MaxBufferSize controls the maximum size of the Jet engine's internal cache. By default, it is set to optimize performance on most machines. It does this by reading data in 2KB pages, placing the data in a memory cache. The data in the cache is readily available to forms, reports, tables, and queries. Lowering the value for MaxBufferSize frees memory for other tasks. This might be helpful on a machine with a minimum memory configuration.
ReadAheadPages controls the number of 2KB data pages that the Jet database engine reads ahead when performing sequential page reads. This number can range from 0 to 31, with the default at 16. The higher the number, the more efficient Access is at reading ahead so that data is available when you need it. The lower this number, the more memory is freed up for other tasks.
As you configure any of these settings, remember that what is good for one machine is not necessarily good for the next. The settings for each machine need to be optimized with its unique hardware configuration in mind.
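As a sketch, on a typical Access 95 installation these Jet settings live in the Windows registry under a key similar to the following. The exact key path and the values present can vary with your installation, so treat this layout as illustrative rather than definitive:

```
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\3.0\Engines\Jet 3.0
    MaxBufferSize     ; maximum size of Jet's internal cache
    ReadAheadPages    ; 0 through 31; pages read ahead on sequential reads
```

Edit the values with the Registry Editor and restart Access so that the Jet engine rereads them at startup.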
Improvements have been made to the Jet 3.0 engine that dramatically improve performance over its predecessors. The Jet 3.0 engine is fully 32-bit and takes advantage of multiple execution threads, providing significant performance benefits. Statistics comparing Jet 3.0 with Jet 2.0 show that 60 percent of all operations run at least 20 percent faster on the Jet 3.0 engine. Multiuser performance is benchmarked at up to six times faster!
Specific improvements to Jet 3.0 include faster delete operations, fewer read locks on index pages, and new mechanisms for page reuse and allocation. The Jet 3.0 engine also does implicit transaction processing. This means that, although many of us wrapped processing loops in the BeginTrans...CommitTrans construct in earlier versions of Access so that we could limit the number of disk writes, the Jet 3.0 engine handles this quite well on its own.
You can do many things to improve the performance of an application, and most of them require significant attention and expertise on your part. The Performance Analyzer does some of the work for you: it analyzes the design of an Access application and suggests techniques that can be used to improve the application's performance. Many of the techniques that the Performance Analyzer suggests can be implemented automatically.
To use the Performance Analyzer, select Tools|Analyze|Performance. The dialog shown in Figure 23.1 appears.
Figure 23.1. The Performance Analyzer dialog.
Select the individual tables, queries, forms, reports, macros, modules, and relationships that you want the Performance Analyzer to scrutinize. After you click OK, the Performance Analyzer analyzes the selected objects. When it has completed the analysis, the Performance Analyzer dialog appears (see Figure 23.2), providing you with a list of suggested improvements to the selected objects. The suggested improvements are broken down into recommendations, suggestions, ideas, and items that were automatically fixed, and include things such as the addition of an index or the conversion of an OLE object. For example, after analyzing the NorthWind database that ships with Access, the Performance Analyzer suggested that the form called Customers use a stored query as the row source for the control called Country. If you were to click the suggestion and then click Optimize, Access would prompt you for a name for the query and then make the change for you.
Figure 23.2. The Performance Analyzer dialog.
Now that you have seen the changes that you can make to your environment to improve performance, let's talk about the changes that you can make to your data structures to optimize performance. Such changes include eliminating redundant data, utilizing indexes, selecting appropriate field data types, and various query techniques.
Optimizing performance by tweaking the data structure is imperative for good performance. No matter what else you do to improve performance, poor data design can dramatically degrade the performance of your application. All other optimization attempts are futile without proper attention to this area.
Days and days can be spent optimizing your data. These changes must be well thought out and carefully analyzed. They are often made over time as problems are identified. Such changes can include those in the following sections.
In essence, this means you should normalize your tables. Data that appears in multiple places can significantly slow down your application. An example would be a company address appearing in both the customer table and the orders table. This information should be included only in the customer table. Queries should be used to combine the address and order data when needed.
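As a sketch of this approach (the table and field names here are illustrative, not objects from the sample database), a query such as the following recombines the address and order data at retrieval time, so the address itself is stored only once, in the customer table:

```sql
SELECT tblOrders.OrderID, tblOrders.OrderDate,
       tblCustomers.CompanyName, tblCustomers.Address, tblCustomers.City
FROM tblCustomers INNER JOIN tblOrders
     ON tblCustomers.CustomerID = tblOrders.CustomerID;
```

Saved as a stored query, this join can then serve as the record source for any form or report that needs both sets of data.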
When it comes to performance, unfortunately, there are no hard and fast rules. Although most of the time you gain performance by normalizing your data structure, there are times when denormalizing can help. This is generally the case when you forever find yourself creating a particular join over and over again. You can try denormalizing the data to see whether dramatic performance improvements result. Remember that denormalization has definite down sides regarding data integrity and maintenance.
It is amazing how far an index can go in improving performance. Fields on both sides of a join should be indexed. Any fields or combination of fields on which you search should also be included in an index. You should create indexes for all columns used in query joins, searches, and sorts. You should create Primary Key indexes rather than unique indexes, and unique indexes rather than nonunique indexes. The performance improvements rendered by indexes are profound.
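As an illustrative sketch (the table, field, and index names are assumptions, not objects from the sample database), an index can be added in code through DAO as follows; you can accomplish the same thing interactively in table Design view:

```vba
Sub CreateSearchIndex()
    ' Add an index on a field that is frequently searched or joined.
    Dim db As Database
    Dim tdf As TableDef
    Dim idx As Index
    Set db = CurrentDb()
    Set tdf = db.TableDefs("tblCustomers")
    ' Create the index, attach the field to it, and append it to the table.
    Set idx = tdf.CreateIndex("idxCompanyName")
    idx.Fields.Append idx.CreateField("CompanyName")
    tdf.Indexes.Append idx
End Sub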
When defining a field, select the shortest data type available for the storage of the data. For example, if you will be storing a code between 1 and 10 within the field, there is no reason to select double for a numeric field.
Optimizing your queries requires lots of practice and experimentation. For example, some queries involving a one-to-many relationship run more efficiently if the criteria are placed on the "one" side of the relationship; others run more efficiently if the criteria are placed on the "many" side. Some basic techniques can go a long way toward improving the performance of your queries and your application as a whole.
Now that you have seen what you can do with the design of your queries to improve performance, take a look at a couple of simple techniques that you can employ to improve the performance of your queries.
A simple but often neglected method of optimizing queries is to deliver your queries compiled. A query compiles when you open it in Datasheet view and then simply close it. If you modify a query and then save it, it is not compiled until the query runs. Delivering precompiled queries ensures that they run as quickly as possible.
Finally, it is important that you compile your queries using the same amount of data that your application will contain, because Jet's query optimizer optimizes the query differently depending upon the amount of data that it finds. If you build a query using 100 records that will run on a live table containing 100,000 records, the query won't be properly optimized. You must rerun and resave your query using the correct quantity of data if you want the query to be properly optimized.
No matter what you do to optimize the operating-system environment and improve your data design, poor code can continue to bog you down. A properly optimized application is optimized in terms of the environment, data design, and code. Just as poor table design can degrade performance, poor coding techniques can also have a dramatic negative effect on performance. Changes to your code include the elimination of variants and dead code, the utilization of built-in collections, and the use of specific object types. An important code-related optimization is to deliver your modules precompiled.
The following changes and techniques can all aid in the improvement of performance. It is important to recognize that any one change won't make much of a difference. However, an accumulation of all of the changes, especially where code is being reexecuted in a loop, can make a significant impact on the performance of your application.
Variant variables are the slowest. They carry a lot of overhead because they are resolved at runtime. Remember that a declaration like the following, with no type specified, creates a Variant variable:

Dim intCounter
To strong-type this variable as an integer, for example, your code must be modified to look like this:
Dim intCounter As Integer
Not only should you strong-type your variables, but you should also use the smallest data type possible. Remember that data types such as Boolean, Byte, Integer, and Long are the smallest and therefore the fastest. These are followed by Single, Double, Currency, and (finally) Variant. Of course, if you must store very large numbers with decimal points in a variable, you cannot pick Single. Just keep in mind that it is wise to select the smallest data type appropriate for the use of the variable.
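For example, a set of declarations that follows this advice might look like the following; the variable names are, of course, arbitrary:

```vba
Dim blnActive As Boolean   ' 2 bytes; ideal for True/False flags
Dim bytCode As Byte        ' 1 byte; plenty for a code between 1 and 10
Dim intCount As Integer    ' 2 bytes; small whole numbers
Dim lngRows As Long        ' 4 bytes; whole numbers beyond Integer range
Dim curTotal As Currency   ' 8 bytes; fixed-point money values
Dim dblRatio As Double     ' 8 bytes; only when you truly need the precision
```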
Just as the general variant data type is inefficient, generic object variables are also inefficient. The MakeItBold subroutine uses a generic object variable:
Private Sub cmdMakeBold_Click()
    Call MakeItBold(Screen.PreviousControl)
End Sub

Sub MakeItBold(ctlAny As Control)
    ctlAny.FontBold = True
End Sub
On the other hand, the SpecificBold subroutine uses a specific object variable:
Private Sub cmdSpecificBold_Click()
    Call SpecificBold(Screen.PreviousControl)
End Sub

Sub SpecificBold(txtAny As TextBox)
    txtAny.FontBold = True
End Sub
The difference is that the SpecificBold routine expects to receive only text boxes. It does not need to resolve the type of object it receives and is therefore more efficient.
This code is contained within a database called CHAP23EX.MDB, found on your sample code CD-ROM. The example can be found in frmObjVar.
There is a tendency to call out to procedures for everything. This is good from a maintenance standpoint but not from an efficiency standpoint. Each time VBA calls out to a procedure, additional time is taken to locate and execute the procedure. This is particularly evident when the procedure is called numerous times. You need to decide how important maintainability is as compared to speed.
The following code is very inefficient:
If bFlag = True Then
    bFlag = False
Else
    bFlag = True
End If
It should be modified to look like this:
bFlag = Not bFlag
Besides requiring fewer lines of code, this expression evaluates much more quickly at runtime.
The built-in collections are there whether you use them or not. By using For Each...Next and a collection of objects, you can write very efficient code. The following is an example:
Sub FormCaption()
    Dim frm As Form
    For Each frm In Forms
        frm.Caption = frm.Caption & " - " & CurrentUser()
    Next frm
End Sub
Here, you are using the forms collection to quickly and efficiently loop through each form, changing the caption on its title bar.
Using the Len function is more efficient than testing for a zero-length string. Here are the two options:
Sub SayNameZero(strName As String)
    If strName <> "" Then
        MsgBox strName
    End If
End Sub
Sub SayNameLen(strName As String)
    If Len(strName) Then
        MsgBox strName
    End If
End Sub
The second example is easier for VBA to evaluate and therefore runs more quickly and efficiently.
This example is very similar to the previous one. It is better to evaluate for True and False instead of zero. Here is an example:
Sub SaySalaryZero(lngSalary As Long)
    If lngSalary <> 0 Then
        MsgBox "Salary is " & lngSalary
    End If
End Sub
The following code would run more efficiently:
Sub SaySalaryTrue(lngSalary As Long)
    If lngSalary Then
        MsgBox "Salary is " & lngSalary
    End If
End Sub
In versions of Access prior to Access 95, transactions dramatically improved performance. Using explicit transactions, the data is written to disk only once, upon the CommitTrans. All changes between a BeginTrans and a CommitTrans are buffered in memory. Because disk access is the slowest thing on a computer, this technique offered you major performance benefits in versions of Access prior to Access 95. The difference with Access 95 is that the Jet 3.0 engine implicitly buffers transactions. Most of the time Jet's own transaction handling offers better performance than your own. At other times, you can improve on what Jet does on its own. The only way to know for sure is to do your own benchmarking. Each situation will be different.
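If you want to benchmark explicit transactions against Jet's implicit handling, a sketch such as the following wraps an update loop over the tblProjects table in an explicit transaction; time it both with and without the BeginTrans and CommitTrans lines and compare:

```vba
Sub UpdateInTransaction()
    ' All edits between BeginTrans and CommitTrans are buffered in
    ' memory and written to disk once, at the CommitTrans.
    Dim wrk As Workspace
    Dim db As Database
    Dim rst As Recordset
    Set wrk = DBEngine.Workspaces(0)
    Set db = CurrentDb()
    Set rst = db.OpenRecordset("tblProjects", dbOpenDynaset)
    wrk.BeginTrans
    Do Until rst.EOF
        rst.Edit
        rst!ProjectTotalEstimate = rst!ProjectTotalEstimate + 1
        rst.Update
        rst.MoveNext
    Loop
    wrk.CommitTrans
End Sub
```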
As you modify your subroutines and functions, you often declare a variable and then never use it. Each Dim statement takes up memory, whether or not the variable is ever used. Furthermore, Declare statements, which are used to call external library functions, also take up memory and resources; they, too, should be removed if they are not being used.
Most programmers experiment with various alternatives for accomplishing a task. This often involves creating numerous test subroutines and functions. The problem is that most people do not remove this code when they are done with it. This dead code is loaded with your application and therefore takes up memory and resources. Several third-party tools are available that can help you to find both dead code and unused variable declarations. One that many people use is Total Access Analyzer by FMS, Inc. The Performance Analyzer, included as part of Access 95, can also assist you with the process of eliminating dead code and variables.
If you are going to repeatedly refer to an object, you should declare an object and refer to the object variable rather than the actual control. Here's an example:
Forms!frmAny!txtHello.FontBold = True Forms!frmAny!txtHello.Enabled = True Forms!frmAny!txtHello.Left = 1 Forms!frmAny!txtHello.Top = 1
This is a very scaled-down example, but if numerous properties are being changed, or if this code is being called recursively, an object variable can be used to make the code more efficient.
Private Sub cmdChangeObject_Click()
    Dim txt As TextBox
    Set txt = Forms!frmHello!txtHello1
    txt.FontBold = True
    txt.Enabled = True
    txt.Left = 100
    txt.Top = 100
End Sub
Another way to optimize the code in the previous example is to use a With...End With construct. The code would look like this:
Private Sub cmdChangeObjectWith_Click()
    With Forms!frmHello!txtHello2
        .FontBold = True
        .Enabled = True
        .Left = 100
        .Top = 100
    End With
End Sub
In the preceding example, you used Forms!frmHello!txtHello to refer to a control on the current form. It is more efficient to refer to the control as Me!txtHello because VBA searches only in the local name space. Although this makes your code more efficient, the downside is that the Me keyword only works within form modules. It does not work within code modules. This means that you cannot include the Me keyword in generic functions that are accessed by all of your forms.
Many of the VBA string functions come in two forms: one with a dollar sign ($) and one without. An example is Left(strName, 1) versus Left$(strName, 1). It is more efficient to use the version with the dollar sign whenever possible. The functions with the dollar sign return strings rather than variants. When a string is returned, VBA does not need to perform type conversions.
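A quick sketch of the difference:

```vba
Dim strName As String
Dim strFirst As String
strName = "Optimization"
strFirst = Left(strName, 1)   ' returns a Variant; VBA must convert it
strFirst = Left$(strName, 1)  ' returns a String directly; no conversion
```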
Array elements take up memory regardless of whether they are being used. Therefore, it is sometimes preferable to use dynamic arrays. The size of a dynamic array can be increased as needed. If you want to reclaim the space used by all of the elements of the array, you can use the Erase statement as follows:

Erase aNames
If you want to reclaim some of the space being used by the array without destroying data in the elements that you want to retain, use ReDim Preserve:

ReDim Preserve aNames(5)
This statement sizes the array to six elements. (It is zero-based.) Data within those six elements is retained.
Constants improve both readability and performance. A constant's value is resolved upon compilation. The value that the constant represents is written to code. A normal variable has to be resolved as the code is running, because VBA needs to obtain the current value of the variable.
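For example (the constant name and value here are arbitrary), the value of conMaxRetries in the following sketch is written into the compiled code, whereas a variable used in its place would have to be looked up each time through the loop:

```vba
Const conMaxRetries As Integer = 5   ' resolved once, at compile time
Dim intTry As Integer
For intTry = 1 To conMaxRetries
    ' ... attempt the operation ...
Next intTry
```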
A bookmark provides you with the most rapid access to a record. If you are planning to return to a record, set a variable equal to that record's bookmark. It is very easy to return to that record at any time. Here's an example:
Sub BookMarkIt()
    Dim db As Database
    Dim rst As Recordset
    Dim strBM As String
    Set db = CurrentDb()
    Set rst = db.OpenRecordset("tblProjects", dbOpenSnapshot)
    strBM = rst.Bookmark
    Do Until rst.EOF
        Debug.Print rst!ProjectID
        rst.MoveNext
    Loop
    rst.Bookmark = strBM
    Debug.Print rst!ProjectID
End Sub
The preceding code is found in basOptimize of CHAP23EX.MDB. The bookmark is stored into a variable before the Do Until loop is executed. After the Do Until loop executes, the recordset's bookmark is set equal to the value contained within the string variable.
Object variables take up memory and associated resources. Their value should be set equal to Nothing when you are done using them. This conserves memory and resources.
Set oObj = Nothing
Besides being easier to code, it is much more efficient to execute a stored query than to loop through a recordset, performing some action on each record. Consider the following example:
Sub LoopThrough()
    Dim db As Database
    Dim rst As Recordset
    Set db = CurrentDb()
    Set rst = db.OpenRecordset("tblProjects", dbOpenDynaset)
    Do Until rst.EOF
        rst.Edit
        rst!ProjectTotalEstimate = rst!ProjectTotalEstimate + 1
        rst.Update
        rst.MoveNext
    Loop
End Sub
The preceding code, found in basOptimize of CHAP23EX.MDB, loops through a recordset, adding one to each project total estimate. Contrast this to the following code:
Sub ExecuteQuery()
    Dim db As Database
    Set db = CurrentDb()
    db.Execute "qryLowerEstimate"
End Sub
This code executes a stored query called qryLowerEstimate. The query runs much more efficiently than the Do Until loop.
Applications run more slowly when they are not compiled. Forms and reports load more slowly, and the application requires more memory. If you deliver your application with all of the modules compiled, they do not need to be compiled on the user's machine before they are run.
To easily recompile all modules, select Run|Compile All Modules with the Module window active. This command opens and compiles all code in the application, including the code behind forms and reports. You must follow the Run|Compile All Modules command with the File|Save All Modules command. This preserves the compiled state of the application. If you compile all modules and do not save the compiled state, you might as well not have compiled in the first place.
Don't bother selecting the Run|Compile All Modules command if you plan to make additional changes to the application. An application becomes decompiled whenever the application's controls, forms, reports, or modules are modified. Even something as simple as adding a single control to a form will cause the application to lose its compiled state. Therefore, it is important to perform the Run|Compile All Modules command and the File|Save All Modules command immediately before you distribute the application.
VBA code can theoretically be placed in any module within your application. The problem is that a module is not loaded until a function within it is called. After a single procedure in a module is called, the entire module is loaded into memory. Furthermore, if a single variable within a module is used, the entire module is loaded into memory. As you might imagine, if you design your application without much thought, every module in your application will be loaded.
If you place similar routines all in one module, that module is loaded and others are not loaded. This means that if users are using only part of the functionality of your application, they will never be loading other code modules. This conserves memory and therefore serves to optimize your application.
Several things can be done to forms and reports to improve your application's performance. These include techniques to quickly load the forms and reports, tips and tricks regarding OLE objects, and special coding techniques that apply only to forms and reports.
Because forms are your main interface to your user, making them as efficient as possible can go a long way toward improving the user's perception of performance within your application. Additionally, many of the form techniques are extremely easy to implement.
Form-optimization techniques can be categorized in two ways: those that make the forms load more quickly, and those that enable you to more efficiently manipulate objects within the form.
The larger a form is and the more controls and objects that you have placed on it, the less efficient that form will be. Make sure that controls on the form do not overlap. It is also extremely beneficial to group form data onto logical pages. This is especially important if your users have insufficient video RAM. Objects on subsequent pages should not be populated until the user moves to that page.
Forms and their controls should be based upon saved queries. Include only fields required by the form in the form's underlying query. Avoid using Select * queries. Because Access is so efficient at internally optimizing the manipulation of query results, this improves the performance of your forms. To further take advantage of the power of queries, reduce the number of records that the query returns, loading only the records you need at a particular time.
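As a sketch of the last point (the table, field, and event names are illustrative), you can assign the form a record source that returns only the fields and records needed right now, rather than binding it to the whole table:

```vba
Private Sub Form_Open(Cancel As Integer)
    ' Load only recent orders, and only the fields the form displays.
    Me.RecordSource = "SELECT OrderID, OrderDate, CustomerID " & _
        "FROM tblOrders WHERE OrderDate >= Date() - 30"
End Sub
```

In practice, you would save a query like this and name it as the form's RecordSource property; the code form is shown here only to make the restriction explicit.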
OLE objects take far more resources than do images. If an OLE bitmapped object does not need to be changed, convert it to an image. To accomplish this, click on the object and select Format|Change To.
Avoid the use of subforms whenever possible. Access treats a subform as a separate form, so it takes up significant memory.
Make sure that the RowSource for a combo box includes only the columns needed for the combo box. Index on the first field that appears in the combo box. This has a dramatic effect on the speed at which a user can move to an element of the combo box. Also, whenever possible, make the first visible field of a combo box a text field. Access converts numeric fields to text as it searches through the combo box to find a matching value.
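A RowSource that follows these guidelines might look like the following sketch, with only the two columns the combo box needs and a text field as the first visible column (the control, table, and field names are illustrative):

```vba
' CustomerID is the hidden bound column (ColumnWidths of 0 for it);
' CompanyName, a text field that should be indexed, is the first
' visible column the user types against.
Me!cboCustomer.RowSource = "SELECT CustomerID, CompanyName " & _
    "FROM tblCustomers ORDER BY CompanyName"
```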
A general rule regarding the performance of forms is to place all database objects, except data, on each user's machine. This eliminates the need for Access to constantly pull object definitions over the network.
Another tip that can help you to dramatically improve the performance of your forms is to use the default formatting and properties for as many controls as possible. This acts to significantly improve performance because only the form and control properties that differ from the default properties are saved with the form.
Many of the report-optimization techniques are the same as the form-optimization techniques. Reducing the number of controls, avoiding overlapping controls, basing reports on queries, and avoiding OLE objects are all techniques that improve the performance of reports as well as forms.
A special technique that can be used to improve the performance of reports is new to Access 95. It involves the No Data event. This event is fired when a report is opened and no data is returned by the record source of the report.
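A typical handler for this event cancels the report and tells the user why nothing printed; the message text is, of course, up to you:

```vba
Private Sub Report_NoData(Cancel As Integer)
    ' Cancel the report rather than printing an empty shell.
    MsgBox "There are no records to print.", vbInformation
    Cancel = True
End Sub
```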
To ensure that the Time and Billing application is optimized, you can apply many of the techniques covered in this chapter: tune its environment, refine its data structures and queries, tighten its code, and streamline its forms and reports.
The most attractive application can be extremely frustrating to use if its performance is less than acceptable. Because Access itself requires significant resources, you must take the responsibility of making your code as lean and efficient as possible.
This chapter focused on several techniques for improving performance. Probably one of the easiest ways to improve performance is by modifying the hardware and software environment within which Access operates. This chapter covered how techniques such as adding RAM, defragmenting a hard disk, and tuning virtual memory and other settings can act to dramatically improve the performance of your applications. This chapter also showed you how to elicit the aid of the Performance Analyzer to quickly and easily identify problem areas in your application. Finally, this chapter focused on data-design fundamentals, coding techniques, and form- and report-optimization techniques.
By following the guidelines covered in this chapter, you can help to ensure that you do not inadvertently introduce bottlenecks into your application. Although any one of the suggestions included in this chapter might not make a difference by itself, the combined effect of these performance enhancements can be quite dramatic.