6/11/10

Microsoft Access Performance Tips to Speed up Your Access Databases

Microsoft Access Table Performance Tips

Normalize Your Tables

Normalize your data so the same data is stored only in one place. Remember that time is also a dimension and you may need to store historic data as well. Read our paper on Data Normalization Fundamentals for more detail.

Every Table Should Have a Primary Key

Every table in your database should have a primary key. This allows your database application to quickly locate specific records. Additionally, you cannot create secondary indexes on a table's fields unless that table has a Primary Key.

Primary Key Should be One Field and Numeric

The primary key should only be one field and ideally numeric and meaningless. Primary keys define the uniqueness of each record which can be accomplished efficiently with a single number. The easiest is using an AutoNumber field in Access or Identity column in SQL Server. The primary key is also used in each secondary index for that table, so the smaller the better. Multi-field primary keys and non-numeric text fields are less desirable.

That said, some tables should simply use text fields as primary keys because they don't change much and the tables are relatively small. For instance, a list of countries or states (no need to create a separate number for each country or state).

Having a meaningless primary key means the index is stable even when data changes. Otherwise, changes in the primary key have a ripple effect through each secondary index and any other tables bound by referential integrity.

Tables Should Participate in Relationships

Related tables with one-to-one or one-to-many relationships should implement referential integrity with cascading deletes and/or updates to ensure orphan records are not created. With cascading deletes, the removal of a parent record automatically deletes the child records in the corresponding table. Access (the Jet Engine) automatically takes care of this for you without the need to write any code, which is great for maintaining data integrity.

With a relationship established, a hidden index is created for the foreign key, so links between those tables are faster in queries. The link is also automatically drawn for you when you put the two tables on your query designer.

Eliminate Unnecessary Subdatasheets

By default, Access creates subdatasheet views between related tables. This is nice if you want it, but is a huge performance hit every time you open the table. Set this property to None if you don't want it. Here's more information on the Subdatasheet Name Property.

Choose the Optimal Data Types

Choose the best data types for your fields. By choosing the optimal data type, you can decrease both the disk space used to store data, and the time it takes Access to retrieve, manipulate, and write data. The general guideline is to choose the smallest data type possible to store a particular type of data.

Add Secondary Indexes As Necessary

If your tables are large and you search on a field or use it in a join, create a secondary index on the field(s). Secondary indexes can offer performance gains of an order of magnitude.

Don't Over-Index

Just as it is important to add indexes to fields that need it, it is important to avoid indexing fields that don't need it. Every index adds to the time it takes the database engine to update, delete and add records.

Don't Index Fields with Lots of Identical Data

Don't apply indexes to fields that contain mostly identical data. For example, indexing a Yes/No field almost always degrades performance. Similarly, if a number field contains only two or three distinct values, an index wouldn't be a good idea. To check the number of unique entries in an index, use the Access DistinctCount property. Compare this value to the number of records in the table and you can quickly see whether the index is doing you any good.

Keep Indexes As Small As Possible

When creating a multi-field index, index only as many fields as are absolutely necessary.



Linked Databases and Linked Tables

Keep an Open Handle to Every Linked Database

You can significantly increase the performance of opening tables, forms, and reports that are based on linked tables by having Microsoft Access keep the linked table's database open. To do this, open a Database variable in Visual Basic code using the OpenDatabase method. Keep this variable open as long as your application is running. This forces Access to keep that database open, making access to linked tables much faster.

For more information, read our tip on Increasing the Performance of Linked Databases
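
A minimal sketch of this technique in DAO; the back-end path shown is hypothetical:

Dim dbLinked As DAO.Database   ' module-level variable so the handle persists

Sub OpenLinkedBackEnd()
    ' Open and hold a handle to the linked data database for the life of the app
    Set dbLinked = DBEngine.Workspaces(0).OpenDatabase("\\Server\Data\MyData.mdb")
End Sub

Sub CloseLinkedBackEnd()
    ' Call this when the application shuts down
    If Not dbLinked Is Nothing Then
        dbLinked.Close
        Set dbLinked = Nothing
    End If
End Sub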

Minimize Record Navigation

Avoid record navigation wherever possible on linked tables. Use the PageUp and PageDown movements, and the Move Last movements, only when absolutely necessary.

Use DataEntry Mode

If you are only going to be adding records, use the DataEntry command on the Records menu. This data access method is more efficient for adding records because existing records are not read from the database.

Create a Data Entry Form

If you often have the need to add new records to a linked table, consider creating an "Add Records" form and setting that form's DataEntry property to Yes. This prevents Access from attempting to retrieve all the existing records in a linked table when all you need is to enter new records.
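
One way to open such a form in add-only mode from code; the form name here is hypothetical:

' Open the form so existing records are not retrieved
DoCmd.OpenForm "frmAddRecords", DataMode:=acFormAdd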

Limit the Data Returned by Queries

Limit the number of fields and records returned by using filters or queries. This reduces the amount of data that needs to be read from the linked table, thereby providing faster performance.

Don't Use Domain Aggregate Functions

If a query is based on one or more linked tables, avoid using functions (built-in or user-defined), or domain aggregate functions in the query criteria. When you use domain aggregate functions (such as DLookup), Access must fetch all records in the function's data source to execute the query.

Release Locks ASAP

To improve multi-user concurrency, assume that other users will be trying to edit data in the same linked tables that you are using. In other words, keep records locked only as long as is necessary.



SQL Server and ODBC Performance Tips

Link ODBC Tables

If you are going to be accessing a SQL database table, link the table permanently. This makes opening that table much faster in subsequent attempts. This is because linked tables cache a lot of information about the source table in your database, making the retrieval of the same structural information unnecessary after the linked table is created.

Minimize Server Traffic

Minimize the amount of data returned from the server. Do this by structuring your queries to return only the fields and records needed. Reduce the number of fields returned, and put constraints on the query by using the WHERE clause.

Use Snapshots When Appropriate

Don't open Dynaset-type Recordset objects on SQL database tables unless you need to add or edit records, or need to see the changes made by other users. Instead, consider using Snapshot-type Recordsets, which can be faster to scroll through. Of course, Snapshot Recordsets can take longer to open, since they require a full read of the source data.
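
A sketch of opening a Snapshot from DAO, with hypothetical table and field names:

Dim db As DAO.Database
Dim rs As DAO.Recordset

Set db = CurrentDb()
' Read-only Snapshot: no editing overhead, fast scrolling
Set rs = db.OpenRecordset("SELECT OrderID, OrderDate FROM Orders", dbOpenSnapshot)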

Use Dynasets for Large Record Sets

If you need to retrieve a large number of records, use a Dynaset instead of a Snapshot. Snapshot type recordsets must load all records from the data source before becoming available, whereas Dynasets are available as soon as the first 20 or so records are retrieved. Also, when using a Snapshot against large ODBC data sources, you run the risk of running out of disk space on your local computer. This is because all data is downloaded into RAM until the amount of RAM is exhausted. Then, the database engine creates a temporary database to store the contents of the snapshot. In a nutshell, when you open a snapshot, you need at least as much disk space as the largest table you are opening.

Take Advantage of the Cache

Use cache memory wherever possible when using external SQL data sources. Microsoft Access forms and reports have automatic caching mechanisms. When using recordsets in your Visual Basic code, use the CacheStart, CacheEnd and FillCache methods to maximize cache effectiveness.
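
A sketch of the caching pattern against a linked ODBC table; the Orders table is hypothetical:

Dim rs As DAO.Recordset

Set rs = CurrentDb.OpenRecordset("SELECT * FROM Orders", dbOpenDynaset)
rs.CacheSize = 100            ' hold 100 records in the local cache
rs.CacheStart = rs.Bookmark   ' begin caching at the current record
rs.FillCache                  ' fetch the cached records in one round trip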

Don't Force Local Query Processing

Don't use query constructs that cause processing to be done by Access on the local computer. The following query operations force the Jet database engine to perform local data processing:

  • Join operations between tables that are linked to different data sources (i.e., a join between a SQL table and a linked Access table)
  • Join operations based on a query that uses the DISTINCT keyword, or a query that contains a GROUP BY clause
  • Outer joins that contain syntax not directly supported by the remote database server
  • The LIKE operator used with Text or Memo fields
  • Multi-level grouping and totaling operations
  • GROUP BY clauses that are based on a query with the DISTINCT keyword or a GROUP BY clause
  • Crosstab queries that have more than one aggregate, or that have field, row, or column headings that contain aggregates, or that have an ORDER BY clause
  • User-defined functions, or functions that are not supported by the remote server
  • Complex combinations of INNER JOIN, LEFT JOIN, or RIGHT JOIN operations in nested queries

Use FailOnError For Bulk Updates

If you are using bulk update queries, optimize performance on the remote server by setting the FailOnError property of the QueryDef object or query to Yes.
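
When running such a query from DAO code, the related technique is to pass dbFailOnError to the Execute method so a failure rolls back the whole batch; the saved query name here is hypothetical:

Dim qdf As DAO.QueryDef

Set qdf = CurrentDb.QueryDefs("qryBulkUpdate")
qdf.Execute dbFailOnError   ' abort and roll back the update if any row fails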

Use ODBCDirect

ODBCDirect gives you almost direct access to server data through ODBC. In many cases, it is a faster and more flexible way to reach server data than the traditional Jet linked-table technique.



Microsoft Access Database Performance Tips

Split Your Database into Two Databases

You can improve performance and application maintainability by splitting your Access database into two databases. The "application" part of your database holds all objects except tables, and is linked to the "data" part that contains the actual tables. For more information, read our technical paper Splitting Microsoft Access Databases to Improve Performance and Simplify Maintainability

Use a Current Workgroup Information File

If you are using a workgroup information file (SYSTEM.MDA) created with a previous version of MS Access, convert it to the current version of Access for optimum performance.

Use the Access Performance Analyzer

Microsoft Access has a useful performance tool built right into the product. From the Tools menu, select Analyze, Performance. The Performance Analyzer allows you to select all objects or specific objects, and then runs an analysis looking for potential problems. The Performance Analyzer does not find all of the items that Total Access Analyzer does, but it does offer some tips.

Run the Access Table Analyzer

The Access Table Analyzer makes it easy to properly normalize the data in your tables by breaking tables with repeated or improperly structured data into two or more tables. This tool is available from the Access Tools, Analyze menu.

Reduce the Size of Graphics in your Access 2007 Databases

If you embed graphics on your forms or reports, Access 2007 can store them much more efficiently. Access 2007 can convert graphics into much smaller PNG formats to significantly reduce the size of your databases. This does not affect graphics already on your forms and reports, but helps if you add new graphics or replace existing ones. To activate this feature, change an Access setting: from the Access Office button, choose Access Options, Current Database. At the bottom of the Application Options section, set the Picture Property Storage Format to "Preserve source image format (smaller file size)".

Compact Your Database Often To Reclaim Space

Compacting your database reclaims unused space and makes almost all operations faster. You should do this on a regular basis. Also, be sure to compact anytime you import or delete objects in your database, or compile and save VBA code.

Learn more about Total Visual Agent, a system administration tool that can compact and back up your Access databases on a regular schedule.

Make It Look Faster

If you have exhausted all other performance optimization techniques, consider making your application "look" faster. Do this by displaying status messages and progress meters as your application loads forms, runs queries, and performs any other operation that may take a bit of time. While this doesn't make your application run faster, it appears to run faster.



Microsoft Access Query Optimization

Compact Often to Update Statistics

Compact the database often. When you compact the database, you reorganize records so that they are stored in adjacent spaces, making retrieval faster. Additionally, compacting a database updates its data statistics, which can be used by a query to run faster. You can force a query to recompile (which in turn causes it to use the latest statistics) by opening it in design view, saving it, and then running it.

You may want to defragment your disk using a program such as the Disk Defragmenter that is part of Windows before compacting your database. This leaves contiguous free disk space immediately after the database file. In theory, this makes future additions to the database occur faster. You may want to experiment with this on your system.

Index Query Criteria Fields and Join Fields

Index any fields in the query that are used to set criteria. Index the fields on both sides of a join. Alternatively, you can create a relationship between joined fields, in which case an index is automatically created.

Search Access Help for: Index

Use Identical or Compatible Datatype In Join Fields

Fields that are joined in a query should have the same data type, or compatible data types. For example, the Long Integer data type is compatible with the AutoNumber data type.

Limit Fields Returned by a Query

Where possible, limit the number of fields returned by a query. This results in faster performance and reduced resource usage.

Avoid Calculated Fields and IIF Functions

Avoid calculated fields, or fields that use expressions in subqueries. Pay special care to avoid the use of immediate If (IIF) functions in sub-queries.

Don't Use Non-Indexed Fields for Criteria

Avoid using non-indexed fields or calculated fields for criteria restrictions.

Index Sort Fields

Index any fields you use for sorting. Be careful not to over-index.

Use Temporary Tables to Eliminate Running the Same Queries Over and Over

If you are processing data that's used multiple times (for instance, on multiple reports), it may be faster to store intermediate results in temporary tables rather than running a series of Select queries many times. Create a temporary table to store your results. Empty the table and fill it with your data using an Append query. You can then use that table for multiple reports and forms.
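
A sketch of the empty-and-refill step in DAO, assuming hypothetical tmpResults and qryComplexCalc objects:

Dim db As DAO.Database

Set db = CurrentDb()
' Clear out the previous run's rows
db.Execute "DELETE * FROM tmpResults", dbFailOnError
' Refill the temporary table once; multiple reports then read from it
db.Execute "INSERT INTO tmpResults SELECT * FROM qryComplexCalc", dbFailOnError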

Avoid Domain Aggregate Functions on Foreign Tables

Do not use domain aggregate functions (DLookup for example) in queries to access data from a table that is not in the query. Link to the table and set criteria accordingly, or create a separate aggregate (totals) query.

Use Fixed Column Headings in Crosstab Queries

Wherever possible, use fixed column headings in your crosstab queries with the PIVOT syntax.

Use BETWEEN Rather than >= and <=

BETWEEN lets the database engine look for values in one evaluation rather than two. For example, WHERE [OrderDate] Between #1/1/2010# And #12/31/2010# is a single range test, whereas the equivalent >= and <= pair requires two comparisons.

Use Count (*) To Count Records

If you use the Count function to calculate the number of records returned by a query, use the syntax Count(*) instead of Count([fieldname]). The Count(*) form is faster because it doesn't have to check for Null values in the specified field and won't skip records that are null.

Compile Each Query Before Delivering Your Application

When you compact your database, its data statistics are updated. When you then run a query, these updated statistics are compiled in the query's execution plan. This sequence of events results in the fastest possible query. Before you deliver your application, compact the database, and then force each query to be recompiled. You can force a query to recompile (which in turn causes it to use the latest statistics) by opening it in design view, saving it, and then running it.

Take Advantage of Rushmore Query Optimization

Microsoft Jet uses Rushmore query optimization whenever possible. Rushmore is applied to queries run on native Access data, and on linked FoxPro and dBASE tables. Other linked data types do not support Rushmore optimization. To ensure that Rushmore optimizations are used, create indexes on all fields that are used to restrict a query's output. If you have queries that don't contain indexes on fields used in the query's restriction clause, Rushmore is not used.

Link on Primary Key Indexes Whenever Possible

To make queries run faster, you should have indexes on all fields in the query that join, restrict, or sort the data. Whenever possible, link on Primary Key fields instead of other fields. Indexes are most critical on tables with large numbers of records, so you may not see a difference on small tables. You also don't need to add secondary indexes on fields that are part of referential integrity.

Experiment With One-to-Many Restrictions

If you have a one-to-many join in a query with a restriction, try moving the restriction to the other side of the join. For example, if the restriction is on the many side, move it to the one side. Compare performance results for both versions, and choose the faster one.

De-Normalize If Necessary

Although you should strive to normalize your data for best performance and future flexibility, you may want to consider denormalizing some of your data if you frequently run queries with joins that would benefit from such data restructuring.

Experiment With Subqueries Instead Of Joins

If you have a query with a join that is not performing adequately, consider replacing the join with a subquery. In some cases, the subquery may cause the overall query operation to run faster.

Limit the Number of Records Returned By Each Query

Where possible, queries should use a Where clause to constrain, or limit, the number of records returned. This results in faster performance and reduced resource usage.



Microsoft Access Form Optimization

Save the SQL of the Form RecordSource as a Query

We've seen situations where a saved query loads significantly faster than the same SQL string stored as the RecordSource of a form. Somehow, saved queries are optimized more than the SQL string behind the form.

Close Unused Forms

Close forms that aren't being used. Every form that is open consumes memory that could be used by other parts of your application.

Open Forms Hidden

Consider opening your application's most commonly used forms when your application starts. Set their Visible properties to False, and then make them visible as needed. This front-loads some of the performance hit to the application's load event, making forms display faster when needed.
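
A sketch of the pattern, with a hypothetical form name:

' At application startup, load the form without showing it
DoCmd.OpenForm "frmOrders", WindowMode:=acHidden

' Later, when the user asks for it, showing the form is nearly instant
Forms!frmOrders.Visible = True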

Use the DataEntry Property of a Form

If a form's record source (the table or tables accessed by the form's RecordSource property) contains a large number of records, and the form is primarily used to add new records, set the DataEntry property of the form to Yes. This prevents Access from having to retrieve existing records when the form loads.

Don't Sort A Form's Recordset

Avoid sorting records in a form's underlying record source unless a particular presentation order is absolutely necessary for the form. This makes the form load faster.

Base Forms on Queries-Minimize Fields Returned

Base forms and subforms on queries rather than tables. By doing this, you can use the query to restrict the number of fields returned, making the form load faster.

Use Lightweight Forms

Consider replacing Visual Basic code in a form's module with calls to standard modules, or with hyperlink objects. Then set the form's HasModule property to False. This turns the form into a Lightweight form, making it load faster. Search Access online help for "Lightweight Forms" for more information. In Access 2007, you can use embedded macros for simple operations.

Index Fields Used to Link SubForms to a Form

Index all fields in the subform that are linked to the main form. Also index all fields in the subform that are used for criteria.

Set Editing Properties on SubForms

Set the subform's AllowEdits, AllowAdditions, and AllowDeletions properties to No if the records in the subform aren't going to be edited. Or set the RecordsetType property of the subform to Snapshot.

Reduce the Number of Fields in ListBox and ComboBox Row Sources

In the RowSource property of listbox and combobox controls, include only the fields that are necessary.

Set AutoExpand on ComboBoxes to No

Set the AutoExpand property of comboboxes to No if you don't need the "fill in as you type" feature.

First Field of an AutoExpand ComboBox Should Be Text

In a combobox that has the AutoExpand property set to Yes, the first displayed field should be a Text data type instead of a Number data type. In order to find matches, Access needs to convert numeric values to text. If the data type is Text, this conversion can be skipped.

Optimize Bound ComboBoxes

If the bound field in a lookup combobox is not the displayed field, don't use expressions for the bound field or the displayed field, don't use restrictions (the WHERE clause) in the row source, and use single-table row sources wherever possible.

Move Linked Data Local for ComboBox and ListBox Controls

If the data that fills a list box or combo box does not change often, and that data comes from a linked table, consider moving that data's table into the local database. This can be a huge performance boost, especially if the linked table is located on a network drive.

Group Controls On Multiple Pages

Consider grouping controls on multiple pages. When the form loads, prepare only the controls on the form's first page. Defer operations on other pages' controls, such as setting the record source, until the user moves to that page. This makes the form load faster.

Only Load Subforms on Tab Pages when the Page is Selected

If a tab control contains several pages (tabs) with subforms on them, the form will load more quickly if the subforms on the tabs that aren't visible aren't loaded right away. Since those tab pages aren't viewed yet, you can defer loading the data until the user clicks on the tab. You'll take a performance hit the first time the user clicks on it, but if they never select the tab, you never load the data. For more information, read Microsoft Access Performance Tip: Using Late Binding for Subforms on Tab Controls
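
A sketch of the deferred-load pattern in the tab control's Change event; the control, page, and subform names are hypothetical:

Private Sub tabMain_Change()
    ' Load the subform only the first time its page is displayed
    If Me.tabMain.Value = Me.pgeDetails.PageIndex Then
        If Me.subDetails.SourceObject = "" Then
            Me.subDetails.SourceObject = "fsubOrderDetails"
        End If
    End If
End Sub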

Close Forms That Contain Unbound OLE Objects

Close forms that contain unbound OLE objects when they are not in use. When you activate an unbound OLE object, the memory used in that operation is not released until the form is closed.

Convert Subforms to Listbox or Combobox Controls

Where possible, convert subforms to listbox or combobox controls. It is far quicker to load a control than it is to load an additional form as a subform.

Move Form Module Code to a Standard Module

You can reduce a form's load time by moving its code from the form module to a standard module. When the form loads, the form's module doesn't need to be loaded. Of course, the standard module needs to be loaded at some point, but once a standard module is loaded, it stays in memory until you close the database.

Avoid Unnecessary Property Assignments

Set only the properties that absolutely need to be set. Property assignments can be relatively expensive in terms of performance. Review your form's startup code to ensure that you are not setting any form or control properties that don't need to be set.

Use the Requery Method Instead of the Requery Action

Use the Requery method instead of the Requery action. The method is significantly faster than the action.

Give Visual Feedback

Give the user some visual feedback during long operations. Consider using status meters to display a task's progress. At a minimum, use the Hourglass cursor along with a status message.
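
A minimal sketch using the built-in hourglass and status bar meter:

Dim lngRecord As Long

DoCmd.Hourglass True
SysCmd acSysCmdInitMeter, "Processing records...", 1000
For lngRecord = 1 To 1000
    ' ... perform one unit of work here ...
    SysCmd acSysCmdUpdateMeter, lngRecord
Next lngRecord
SysCmd acSysCmdRemoveMeter
DoCmd.Hourglass False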

Keep Forms Lightweight With Hyperlinks

Hyperlinks in label controls make it easy to open another Access object. So instead of placing command buttons on your forms to do common operations, investigate the possibility of using a label control with the Hyperlink properties. This approach eliminates the need for a command button, and its associated event code.

Split Forms Into Multiple Pages

Consider using multi-page forms, separated by the page-break character. This allows you to present only the controls needed, and can reduce form-load time. For example, if your form has 10 combobox controls that take a long time to fill, split the form into multiple pages using the PageBreak control. Then, pick the 5 combobox controls the user is most likely to use and place them on the first page. Place the remaining controls on the second page. Load time for the form should be substantially reduced, especially if the queries filling those combo box controls are complex.

Minimize the Number of Controls

Minimize the number of controls on your form. Loading controls is the biggest performance hit when loading a form.

Avoid Overlapping Controls

Avoid overlapping controls. It takes Access more time to render and draw controls that overlap each other than it does non-overlapping controls.

Use Graphics Sparingly

Use bitmap and other graphic objects sparingly as they can take more time to load and display than other controls.

Use the Image Control

Use the Image control instead of unbound object frames to display bitmaps. The Image control is a faster and more efficient control type for graphic images.



Microsoft Access Report Optimizations

Save the SQL of the Report RecordSource as a Query

We've seen situations where a saved query loads significantly faster than the same SQL string stored as the RecordSource of a report. Somehow, saved queries are optimized more than the SQL string behind the report.

Don't Sort Report Queries

Don't base reports on queries that use an ORDER BY clause. Access reports use their Sorting and Grouping settings to sort and group records: the sort order of the underlying record set is ignored.

Avoid Expressions and Functions In Sorting and Grouping

Try to avoid reports that sort or group on expressions or functions.

Index Fields Used In Sorting and Grouping

Index any fields that are used for sorting or grouping.

Base Reports on Queries-Minimize Fields Returned

Base reports and subreports on queries instead of tables. By using a query, you can restrict the number of fields returned to the absolute minimum number, making data retrieval faster.

Index Fields Used to Link Subreports

Index all the fields used to link a subreport to a report.

Index Subreport Fields Used for Criteria

Index all subreport fields used for criteria. This will cause the subreport to link its records faster. Of course, remember that over-indexing can cause performance bottlenecks when editing, adding and deleting data.

Avoid Domain Aggregate Functions in a Report's RecordSource

Do not use domain aggregate functions (such as DLookup) in a report's recordsource property. This can have a significant performance impact on how long it takes the report to open and display pages.

Use the NoData Event

Use the report's NoData event to identify empty reports. You can then display a message and close the report. This is easier than running a separate process to see if data exists for the report.
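
A minimal sketch of the event procedure:

Private Sub Report_NoData(Cancel As Integer)
    MsgBox "No records match your criteria.", vbInformation
    Cancel = True   ' stop the empty report from opening
End Sub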

Avoid Unnecessary Property Assignments

Set only the properties that absolutely need to be set. Property assignments can be relatively expensive in terms of performance. Review your report's startup code to ensure that you are not setting any report or control properties that don't need to be set.

Eliminate Unnecessary Subreports

If a subreport is based on the same query as its parent report, or a similar query, consider removing the subreport and placing its data in the main report. While this is not always feasible, such changes can speed up the overall report.

Limit the Number of Controls on Reports

Minimize the number of controls on your report. Loading controls is the biggest performance hit when loading a report.

Avoid Overlapping Controls

Avoid overlapping controls. It takes Access more time to render and draw controls that overlap each other than it does non-overlapping controls.

Minimize Bitmap Usage

Use bitmap and other graphic objects sparingly as they can take more time to load and display than other controls.

Use the Image Control

Use the Image control instead of unbound object frames to display bitmaps. The Image control is a faster and more efficient control type for graphic images.



Macros Tips

Convert Macros to Visual Basic Code

Convert macros to Visual Basic code. In almost all cases, Visual Basic code runs faster than macros.



Microsoft Access Module/VBA Performance Tips

Make an MDE File

If possible, make an MDE file out of your database. An MDE file cannot become decompiled, so your Visual Basic code always runs at top speed. Additionally, since no source code is stored in the MDE file, the database loads faster and uses less memory.

Achieve the Compiled State

Module code is saved in two states in your Access database: the source state, and the compiled state. The source state consists of the contents of your actual modules, with full text including white space, procedure and variable names, and comments. The compiled state is the executable version of your code. All comments and white space have been removed, and a stream of executable instructions has been produced-the code is ready to be run. The difference between these two states can cause your application to run slower than molasses in January if you don't understand them.

When you run a procedure, VBA checks to see if the module containing the procedure is compiled. If it is, VBA simply runs the code. If it is not compiled, VBA compiles it by loading the code into memory, performing a syntax check, and compiling it into an executable format. If all these operations succeed, it can then run the code. You probably surmised that this process of compiling is not a free lunch-it does take some time. And herein lies the crux of the matter: compiling code takes time, and compiling lots of code takes lots of time.

So if you want your database to run as fast as possible, your task is obviously to reduce the amount of time Access spends compiling your code to a bare minimum. In fact, in an ideal application, all your code should be compiled and saved in the compiled state. So how do you go about this? Your Access database (or project in VBA parlance) is said to be in a compiled state when all modules, including form and report modules, are saved in both states in the database. This means that the original source code is stored, as is the compiled version. In such a state, Access runs much faster, because it can completely bypass the compilation process.

Getting your database into the compiled state is actually rather easy:

  1. Open any module
  2. From the Debug Menu, select Compile and Save All Modules

Your database is now in the compiled state. This includes form and report modules (called class modules in Access terminology) and standard modules. All VBA code that is called by your application is immediately ready for execution. There is no need for compilation. This is all fine and well, but it is just as easy for your database to become decompiled. When you make certain changes to your database, it automatically becomes decompiled, which means that the compiled state you created using the previous steps no longer exists.

How to Avoid Decompilation

So how do you avoid decompilation, or loss of the compiled state? Any of the following actions can decompile your database:

  • Modify any module code
  • Make changes to code-bearing objects, such as forms, reports, and controls, or create such code-bearing objects

So the bottom line is: to avoid decompilation, don't do the above. It's not as bad as it seems. After all, your database does not need to be in a compiled state while you are doing development work on it-it only really requires the performance benefits of the compiled state when it is actually running on your users' workstations. Therefore, if you follow these guidelines, you can enjoy peak performance from your module code:

  • During development, don't use Compile All Modules. It is a waste of time, because the first time you make any change to a module, it will decompile, reversing the effect of Compile All Modules. Rather, use the Compile Loaded Modules option instead. This action compiles only the modules that are called by the modules you have open. This is a much quicker operation, and it performs the same syntax checking that Compile All Modules does.
  • When you are ready to deliver your database for testing or live use, put it into the compiled state using the steps outlined above.

Decompile to Eliminate Old VBA Compiled States

The Compact feature only compacts the data portion of an Access database, not the code. To flush the database of old code, shrink the size of your database, and avoid the infamous Bad DLL Calling Convention error, use the /decompile command occasionally. Read this reference for more information on the Microsoft Access decompile feature.
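
The switch is passed on the Access command line; the database path below is hypothetical:

msaccess.exe /decompile "C:\Apps\MyApp.mdb"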

ActiveX Controls Should Have References

If you are using an ActiveX control, your database should have a Reference to the ActiveX (OCX) file. This allows you to use early binding to bind variables to the control's objects, making it load and run faster. In most cases, this is handled for you: when you insert an ActiveX control into a form or report, Access automatically creates a Reference for that control.

Use Option Explicit

Always explicitly declare variables. Ensure this happens in every module in your application by using the Option Explicit statement at the top of each module.

Choose the Most Efficient Variable Type

Use the most efficient variable type possible when declaring variables. For example, don't use a Long Integer when an Integer will do. Avoid Variant types because they can be inefficient and slow.

Use Early Binding

Avoid using the Object data type. Instead use the specific data type for the object you are working with. This allows Visual Basic to employ "early binding" which can be substantially faster in many cases.
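
For example, with DAO:

Dim rsFast As DAO.Recordset   ' early bound: resolved at compile time
Dim rsSlow As Object          ' late bound: every call resolved at run time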

Assign Things to Object Variables

If you are going to refer to a property, control, object or data access object (DAO) more than once, assign it to an object variable.

Use the Me Keyword

Use the Me keyword instead of the Forms!FormName syntax to refer to a form from within that form's module.

Avoid the Immediate If Function if Parts Run Other Code

Use the IIf (immediate If) function sparingly. IIf() does not employ "short-circuit" evaluation: both sides of the expression are always evaluated, which may not be what you want, since intuitively it looks like only the side satisfying the criteria would run.
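
A sketch of the difference; GetFromCache and GetFromServer are hypothetical functions:

' Both functions run on every call, even though only one result is used
varResult = IIf(blnCached, GetFromCache(), GetFromServer())

' An If...Else block runs only the branch that applies
If blnCached Then
    varResult = GetFromCache()
Else
    varResult = GetFromServer()
End If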

Use Dynamic Arrays

Instead of fixed arrays, use dynamic arrays with the Erase and ReDim statements to make better use of memory.
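
For example:

Dim lngValues() As Long

ReDim lngValues(1 To 10000)   ' allocate the array only when needed
' ... work with the array ...
Erase lngValues               ' release the memory as soon as you are done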

Take Advantage of Demand Loading

Organize and structure your modules to take advantage of Visual Basic's demand-loading architecture. When a procedure is loaded from a module, the entire module is loaded into memory. By placing related procedures in the same module, you can reduce the number of loads Visual Basic has to make.

Eliminate Dead Code

Eliminate unused procedures and unused variables. These elements use memory unnecessarily, and slow program load and execution. Our Total Access Analyzer program finds unused procedures, variables, and constants, and variables assigned but not used.

Use Constants Instead of Variables

If you are using data that is not going to change, put it in a constant instead of a variable. This allows Visual Basic to compile the value into the constant when the module is compiled, making the execution of that code faster.

Avoid Infinite Recursion

Avoid Infinite Recursion. Don't have code that can call itself without having some type of short-circuit mechanism. This can lead to "Out of Stack Space" errors.

Declare String Data Intelligently

Visual Basic allocates stack and heap memory differently according to the type of strings you create. By understanding how this works, you can write more efficient string code. String variables in procedures that are non-static use space on the computer's stack. Use the following information to write code that minimizes stack memory usage.

  • Local fixed-length strings of 64 characters or fewer use 2 bytes for each character in the string. They don't use heap memory.
  • Local fixed-length strings longer than 64 characters use 4 bytes of stack memory for a pointer to the variable in heap memory, and 2 bytes of heap memory for each character in the string.
  • Local variable-length strings use 4 bytes of stack memory for a pointer to the variable in heap memory, and a variable amount of heap memory according to the length of the string.
  • If your code uses a large number of fixed-length strings of 64 characters or fewer, you can reduce stack usage by changing the strings to local variable-length strings or making them static fixed-length strings.

Minimize OLE References

Every time you reference a Visual Basic object, method, or property, you are initiating one or more calls to OLE's IDispatch interface. Each of these calls takes time. Minimizing the number of such calls is one of the best ways to make your code run faster. You can minimize OLE references by doing the following (see the sketch after this list):

  • Use object variables instead of directly referring to objects
  • Use the With statement and the For Each construct to minimize object references
  • Move references to properties and methods outside of loops
  • When you refer to a member of a collection, do so with the object's index number. Referring to a collection member by name or expression introduces extra work and, therefore, more time.
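
A sketch of the With technique, using a hypothetical text box on the current form:

' One object reference instead of three
With Me.txtCustomer
    .FontBold = True
    .FontSize = 10
    .ForeColor = vbRed
End With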

Turn Off Screen Painting

Turn off screen painting during repetitive operations that update the screen. Consider using the Application.Echo method to turn off screen painting. Depending on the type of video card in your computer, this can have moderate to dramatic effects on performance.
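
A minimal sketch:

Application.Echo False   ' suspend screen painting
' ... repetitive operation that updates the screen ...
Application.Echo True    ' restore painting (do this in your error handler too)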

Don't Write Code When a Query Would Be Better

We've seen pages of VBA code written using various recordset operations when one or a few queries would do the trick. Queries are not only faster and optimizable, they're easier to understand and maintain. If you're not familiar with how to use Select, Update, Delete, and Append queries, learn them to best use Access or SQL Server.

Close Your Database a Couple of Times a Day

VBA dynamically loads code into memory as needed on a module level. If a function is called or a variable is used, the entire module containing that function or variable is loaded into memory. As you develop your application, you keep loading code into memory. Visual Basic for Applications does not support dynamic unloading of these modules, so RAM will begin to fill up. To boost development performance (i.e., to decrease the amount of time you spend as a developer working on your application), you may want to close the database periodically to unload the modules. Note that you do not have to close Access itself, just the database. However, if you have library database code loaded, you should exit Access as well.

It is especially important to close your database after a Compile All Modules command. The Compile All Modules command pulls all of your code into memory. Closing and reopening the application will unload the code and enable you to develop faster because of the additional free memory.

If you are developing your application in a single-user environment, you can improve your development performance by opening the application exclusively. This allows Visual Basic for Applications to save and compile faster by eliminating multiple-user save situations.

Don't Use Expressions To Determine Loop Boundaries

If you use loop constructs in your VBA code, such as For...Next or Do...While, don't force VBA to evaluate the boundaries of the loop each time. For example, instead of saying:

For intCounter = 0 To Forms.Count - 1
...
Next intCounter

use this:

intCount = Forms.Count - 1
For intCounter = 0 To intCount
...
Next intCounter

In the second example, VBA only has to determine the value of Forms.Count once. In the first example, the value needs to be determined for each iteration of the loop.



Data Access Objects (DAO) Programming Tips

Use Seek Instead of Find

Use Seek instead of the Find methods whenever possible. Seek uses indexes more efficiently and is the fastest data access method.

Search Access Help for: Seek method
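
A sketch of Seek in DAO. Note that Seek requires a table-type Recordset opened on a local table; the table and field names here are hypothetical, though "PrimaryKey" is the default name Access gives a primary key index:

Dim db As DAO.Database
Dim rs As DAO.Recordset

Set db = CurrentDb()
Set rs = db.OpenRecordset("Customers", dbOpenTable)   ' table-type only
rs.Index = "PrimaryKey"
rs.Seek "=", 1234                                     ' find the key value 1234
If Not rs.NoMatch Then
    Debug.Print rs!CompanyName
End If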

Use Bookmarks For Record Navigation

Whenever possible, use bookmarks to move among records instead of using the FindNext method. The Jet engine can navigate to bookmark values more quickly than it can perform the sequential reads required by FindNext.
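
A sketch, assuming rs is an open DAO Recordset:

Dim varReturnSpot As Variant

varReturnSpot = rs.Bookmark   ' remember the current record
' ... navigate elsewhere in the recordset ...
rs.Bookmark = varReturnSpot   ' jump straight back, no sequential search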

Use Indexed Fields for FindRecord and FindNext

If you can't use the Seek method, and must use the FindRecord or FindNext methods, use them on indexed fields. These methods are much more efficient when used on a field that is indexed.

Search Access Help for: Find methods

Don't Use Transactions Unless Necessary

Microsoft Access lets you wrap table update code in transactions so you can rollback incomplete attempts to save data. Any code operation that adds, modifies, or deletes data can be enclosed in a transaction using the BeginTrans...CommitTrans pair.

If you do not need to roll back your updates, you can avoid using transactions and the overhead of maintaining a rollback log.
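
For the cases where rollback is genuinely needed, the pair looks like this in DAO:

Dim wrk As DAO.Workspace

Set wrk = DBEngine.Workspaces(0)
wrk.BeginTrans
' ... add, edit, or delete records here ...
wrk.CommitTrans   ' or wrk.Rollback to discard the changes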



Jet Engine Tuning Tips

Use SHOWPLAN To See Query Execution Plans

Microsoft Jet implements a cost-based query optimizer in its query engine. During the compilation process of the query, Jet determines the most effective way to execute the query. You can view this plan using the ShowPlan registry setting.

To use this setting, use the Registry Editor that comes with your operating system and add the following key to the registry:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\3.0\Engines\Debug

Under this key, add a string data type entry named JETSHOWPLAN in all capital letters. To turn ShowPlan on, set the value of this new entry to "ON". To turn the feature off, set the value to "OFF". When the feature is on, a text file called SHOWPLAN.OUT is created (or appended to if it already exists) in the current directory. This file contains the query plans.

Tune Database Performance with Jet Registry Settings

Microsoft Jet lets you tune many parameters to tweak the engine for the best possible performance. Historically, this was somewhat difficult, because you could only modify these settings from their defaults by creating keys in the registry, setting them to new values, and restarting Access and/or Jet. Version 3.5 of the Jet engine makes this process a whole lot easier. These registry keys are located in:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\3.5\Engines\Jet 3.5

Also, you can use the DAO SetOption method to set these parameters at runtime. Using SetOption causes Jet to apply the changes immediately-the changes are not stored permanently in the registry, and you do not have to restart Access. If you want to get the maximum performance out of your data operations, experiment with these settings.

For more information, search Access online help for "Customizing Windows Registry Settings for Data Access Objects."
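
For example, a sketch using one of the documented SetOption constants; the value shown is illustrative only:

' Change the Jet buffer size for this session; nothing is written to the registry
DBEngine.SetOption dbMaxBufferSize, 1024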

Threads

Experiment with the Threads setting in the registry. With this setting you can increase or decrease the number of operating system threads available to the Jet database engine.

MaxBufferSize

Use the MaxBufferSize registry setting to tune the data buffer used by the Jet database engine.

UserCommitSync

Use the UserCommitSync registry setting to tune performance of explicit transactions.

ImplicitCommitSync

Use the ImplicitCommitSync registry setting to tune the performance of implicit transactions.

FlushTransactionTimeout

Use the FlushTransactionTimeout registry setting to tune the performance of asynchronous write operations.

ExclusiveAsyncDelay

Use the ExclusiveAsyncDelay registry setting to tune the performance of asynchronous mode writes to exclusively opened databases.

SharedAsyncDelay

Use the SharedAsyncDelay registry setting to tune the performance of asynchronous mode writes to databases opened for shared access.

PageTimeout

Use the PageTimeout registry setting to change the delay for checking other users' changes to the database.

LockDelay

Use the LockDelay registry setting to change how long Microsoft Jet waits between retries on locked pages in a shared database.

MaxLocksPerFile

Use the MaxLocksPerFile registry setting to tune performance by setting the maximum number of locks that can be placed on a database (MDB) file. For more information, search Access online help for "Customizing Windows Registry Settings for Data Access Objects", or see Microsoft Knowledge Base article 815281.

RecycleLVs

Use the RecycleLVs registry setting to determine how memo, OLE, and hyperlink data pages are recycled.

Use ISAMStats To See Engine Detail

Microsoft Jet contains an undocumented function called ISAMStats that shows various internal values. The syntax of the function is:

ISAMStats(StatNum As Long [, Reset As Boolean]) As Long

where StatNum is one of the following values:

0 - number of disk reads
1 - number of disk writes
2 - number of reads from cache
3 - number of reads from read-ahead cache
4 - number of locks placed
5 - number of release lock calls

For example, to see the number of disk reads, try the following code:

Dim lngReads As Long
lngReads = DBEngine.ISAMStats (0, False)
Debug.Print lngReads

To reset the number of disk reads back to 0, do this:

lngReads = DBEngine.ISAMStats(0, True)



Microsoft Access Startup Tips

Remove Code From Your Startup Form

If you use a Startup form instead of an AutoExec macro, place the Visual Basic code needed for that form in the form's module instead of a standard module. Since Access has to load your Startup form, it will automatically load that form's module, which is generally faster than loading a standard module. This technique gives your application the appearance that it is loading faster.

Minimize Startup Form Code

Minimize the amount of code in your Startup form. You may want to defer certain operations, such as opening data access objects in code or checking objects, to a later time. This technique gives your application the appearance that it is loading faster.

Don't Use ActiveX Controls on Startup Form

Don't use ActiveX controls on your Startup form. These controls can take longer to load than other controls and will slow down the load time of your application.

Automate the Deployment of the Latest Version of Your Database

Centrally manage and simplify the automatic deployment of your Access application to each desktop. When you update your application, you want to distribute it quickly and easily. It's also important to make sure each user is using the right version of Access. Our Total Access Startup program makes it easy to manage your database version and Access version by letting you distribute a simple shortcut to launch your program.



Multiuser Performance Tips

Split Your Database

Split your database into an application and data database. Place only the data database on the server, keeping the application objects such as forms, reports and queries locally, where they can load and execute faster.

Keep Static Data Local

Keep static data, such as lookup tables, on the local machine. Update the local tables as necessary from the server. For example, a lookup table containing the two-letter abbreviations for American states is not likely to change anytime soon. When such a table is used in a data entry application, it is a performance bottleneck to retrieve that data from the server every time it is needed. Instead, copy that table to your application's local database.

Set Options to Avoid Lock Conflicts

Avoid lock conflicts by setting the Refresh Interval, Update Retry Interval, Number of Update Retries, and ODBC Refresh Interval Settings in the Tools, Options menu.

Tune Jet with the Registry

Investigate the registry settings available for Access and Microsoft Jet for tuning data engine parameters. Search Access online help for "Performance" for more information on these settings.



Computer Performance Tips

Increase RAM

Increase the amount of RAM on your computer. Operating systems, Office, and other programs eat up a lot of memory. With the cost of memory so low, you should have at least 1 GB of RAM. With sufficient RAM, Access can perform its queries entirely in RAM without the need to write temporary data to disk.

Avoid NTFS Compression

If you are using NTFS compression under Windows NT, move your Access application to a non-compressed drive. NTFS adds a significant amount of overhead during disk operations, and Access applications are very disk-intensive.

Make Sure Network Speed is Maximized

Most networks and network cards support 10/100 Mbps standards. Make sure your machine is connecting at 100 Mbps rather than 10 if you're accessing data across your network.

Keep Enough Local Disk Space Free

Access requires a fair amount of disk space to operate, especially with large databases. Operations such as running large action queries, adding lots of data, importing data, and compiling and saving module code can use a lot of additional space on a temporary basis. Additionally, transactions, and compacting the database can use a lot of disk space. A good rule of thumb is to have roughly 5-10 times the size of your largest database available in free local storage.



Windows Performance Tips

Keep Databases Local

Whenever possible, keep your databases on a local drive instead of on a network drive. In almost all cases, local disk access is faster than network access.

Install Access Locally

Do not install Microsoft Access, or its related Microsoft Office components on a network. Most applications, including Access and Office perform better when run from local drives.

Open Databases Exclusively

If you are opening a database, and no other users need to use the database, open the database in exclusive mode. To do this, check the Exclusive checkbox in the Open Database dialog.

Close Unneeded Applications

Free up memory by closing applications that you aren't using. If you are using the System Agent from the Microsoft Windows Plus Pack, consider turning it off to make more memory available for your Access application.

Optimize Your Disk

Keep your disk in optimal shape by deleting unneeded files and emptying your Recycle Bin. Use a disk defragmenter such as the one found in the Windows 95 Accessories, System tools menu.

Close Access Occasionally to Reclaim Leaked Memory

Almost all Windows applications "leak" memory. This is due to data structures, variables, and other memory consumers that are not correctly released by the application. By closing Access, you allow Windows to reclaim this leaked memory.

Install Windows Locally

The Windows operating system should be installed on a local fixed drive on your computer. Because of the number of components involved in running Windows, a network installation of Windows causes poor performance, especially for database applications.



Conclusion

Microsoft Access has provided an incredibly powerful tool for the rapid development of database systems. With this power comes complexity—and complexity can lead to less than desirable performance. Use the tips in this paper to get your Access applications into top shape.

Additional Resources

Here are some additional resources that may help you create faster Access databases and/or create them in less time:

Automated Database Analysis for Best Practices

For automated analysis of your databases, check out our Total Access Analyzer program which is designed to examine every object in your database in detail, provide documentation to help you understand what's in it, how the objects interact, and offer suggestions for applying Microsoft Access Best Practices. It finds hundreds of types of errors, suggestions, and performance tips specific to your database objects.

Simplify Writing, Taking Over, and Delivering VBA Module Code

If you want to write better VBA code and do it in less time, learn about our Total Visual CodeTools program which works with Access, Office, and VB6. Total Visual CodeTools includes code builders, tools to standardize existing code (indentations, variable names, adds error handling, etc.), and tools for you to deliver more robust solutions. Learn more about our Microsoft Access coding tools.

Microsoft Access Source Code Library

Why write all the code yourself? Get our Total Visual SourceBook code library with 85,000+ royalty-free lines of code you can insert into your Access, Office and VB6 projects. Written exclusively for this professional code library, there's code you won't find anywhere else. More info on Microsoft Access code.

Microsoft Consulting Services

FMS also offers custom software development services. If you're in over your head or just don't have time, contact our team to see how we can help you maintain, enhance, and/or migrate your Microsoft Access applications. Here's more on our Microsoft Access Consulting Services.

More Technical Papers

More Tips and Techniques

http://www.fmsinc.com/MicrosoftAccess/Performance.html

6/9/10

Don't sleep longer – sleep smarter

Worried that you don't get the fabled eight hours? That's your first mistake, says Dr Nerina Ramlakhan, who thinks it's quality, rather than quantity, that counts. She shares her tips for a better night's rest with Rob Sharp

Tuesday, 8 June 2010

"Sleeping is no mean art," said Friedrich Nietzsche. "For its sake one must stay awake all day." Indeed, maximising slumber duration can be a complex process. Today's anxiety-ridden, deadline-heavy world can steal away our eight hours of heavenly rest and replace it with a night frustratedly gnawing our pillows.

One professional intent on helping us snooze is Dr Nerina Ramlakhan, the author of Tired But Wired, a book released last month that advises taking naps, relaxing and exercising to hibernate effectively and wake up zinging and refreshed.

"I was frustrated about the lack of information out there," says the sleep and energy coach. "Someone once asked me to recommend a good book on sleep and I couldn't find one. I wanted to debunk many myths about sleep. I've had sleep problems myself for many years and I know what it's like. It's awful to wake up not rested and have to face the day."

Britain is an insomniac nation: the NHS spent almost £36m on sleeping pills in 2008-2009, the most recent figures available. That was a 20 per cent increase on the previous year. Medics put the increase down to people's worries over the credit crunch and unemployment. Modern sleeping pills are less addictive, meaning doctors are more likely to prescribe them.

"The biggest myths are that we need seven or eight hours a day, or that we shouldn't wake up in the night," continues Ramlakhan. "But waking early in the morning is perfectly normal. Students can still function well in an exam if they don't sleep the previous night. I sometimes professionally advise football players and I tell them not to worry about sleep before a big game. Even if they don't manage to sleep perfectly they will still perform well." So what's the key to an effective slumber? Ramlakhan explains the dos and don'ts of catching Z's.

Diet

Not eating pre-bedtime is less important than you would believe: what is pivotal is breakfast. Eating at the right time conditions your body's metabolism to wake up and wind down. "There are these fallacies swirling around that not eating before bed, or not eating lettuce or tuna, can help you sleep," says Ramlakhan. "But it's more crucial that you eat breakfast first thing in the morning, in what I call a 'metabolic window'. It's a timeframe in which you can give your body an important message. It tells it that in your world there is an adequate supply of food, it can relax, and that it can fall into sleep mode when it needs to."

Duration

Margaret Thatcher famously boasted she only needed four hours of shut-eye. Such "role models" perpetuate the myth that there is a one-size-fits-all rule. "In my experience it's all about being attuned to your requirements at different times," continues the expert. Professional footballers training twice a day might need to rest more than sedentary types. "There may be times you need four hours," adds the sleep expert. "At other points that could increase to seven or eight. It's about awareness of your needs."

Power naps

Winston Churchill scrimped on sleep, but liked napping. "You're better well rested than well briefed," was his maxim, and that means taking the odd power nap – it helps us relax, even if we aren't descending into deep sleep. It has a restorative effect on the body and can help us settle down when it is time to turn in.

Dreaming

Dreams have an important role in categorising memories. If we're going through a rough patch we can fear our nightmares – but maybe we should embrace them. They can give us clues about problems we need to address. "I encourage people to look on dreams as friends," says the expert. "I have a lot of clients in the 42 to 49 age bracket who are experiencing a crisis of meaning in their lives and are dreaming a lot. It's their subconscious trying to tell them about the next direction their life should take. Sometimes that can be useful."

Locations

Humans are hunter-gatherers and instinctively need to feel safe before nodding off. "The nuts and bolts aren't so important unless you're a sensitive sleeper," says Ramlakhan. "But if that's the case then there are various options. Some people benefit from white noise, or fans. Make sure you have effective blinds, or earplugs, if they help. It's all about reducing distractions and sending a message to your brain about your safety."

Exercise

Get the balance right. If you over-rest you can become lethargic; if you overexercise you could cause muscle strain and burnout. While spending eight hours staring at a computer monitor can wear out your brain, exercise can make sure your body is ready to wind down when the moment comes. Exercise also produces mood-lightening endorphins. "It aids the production of the complex concoction of hormones – like adenosine and melatonin – which help us bed down and fall asleep at the end of a difficult day," concludes Dr Ramlakhan.

http://www.independent.co.uk/life-style/health-and-families/features/dont-sleep-longer-ndash-sleep-smarter-1994018.html

5/25/10

Driven to Distraction

Are your business problems making you insane? In his debut column, 37signals co-founder Jason Fried argues that one of the keys to success is to let your lazy side guide you.

I think of myself as wildly ambitious and unapologetically lazy. Though we've all heard about the good things that come from ambition, laziness gets a bad rap. That's unfortunate. I can attribute a healthy chunk of my success to the positive returns of laziness. Laziness has the best ROI in the business.

Let's start at the beginning. I launched my first real company, a Web design company called Spinfree, in 1996. It was a solo show: just me, a desk in my apartment, and some self-taught mediocre Web design skills. But it was all I needed. The jobs rolled in, and my clients were happy. I could pay the bills, stash away some savings, and work when and where I wanted.

But I wasn't happy. Rather than building confidence, I was accumulating doubt. As my business expanded, I grew nervous and self-conscious. I began to feel as if my accomplishments weren't enough, that I had to take things to "the next level." I thought if I didn't get there fast enough, I'd be bowled over by the competition.

When I bid on projects against larger design firms, I started saying "we" instead of "I" in an attempt to sound bigger. The proposals submitted by my rivals were long and shiny, so mine had to be longer and shinier. I even began badmouthing the competition -- people I'd never met. That's ugly.

The thing is, I didn't need to do any of these things. I thought I did, but I didn't. I was inventing problems. I was making things hard on myself.

How did I figure this out? Laziness. I got tired and let down my guard and wound up learning something important about myself: I love work, just not hard work. I think hard work is overrated. My goal is to do less hard work. And what's hard? Acting like someone else, writing elaborate proposals I don't believe in, and flinging mud at the competition. That's hard and horrible work.

So I put my laziness to work for me. Instead of long proposals, I wrote short ones. Instead of worrying about competitors, I ignored them. And here's what happened: My company got more work. I found better clients. I slept better. I woke up better. I was happier. And, most of all, running a business became a lot easier.

Fifteen years later, this continues to be the most important lesson I've learned as an entrepreneur: Most of the stuff you agonize about just doesn't matter. Truth is, things are pretty easy and straightforward -- until you make them hard and complicated.

This is the ethos that drives what we do at 37signals, the company I co-founded in 1999. We make simple Web-based collaboration software for small businesses and groups. We have millions of users -- and millions in profits -- but we're just 16 people. We don't act any bigger or smaller. We don't put on airs. We just are who we are.

We don't worry much about what the competition is doing. We don't worry about growing pains we don't have yet. We don't spend time on five-year plans and forecasts, because in my experience, they just don't matter.

We invent software, not problems. Real problems will find you; you don't need to invite fake ones to dinner.

Yet that's precisely what many business owners do. I spend a lot of my time speaking with entrepreneurs and entrepreneurs-to-be. They e-mail me, call me at the office, hit me up on Twitter, or introduce themselves at conferences and events. And for the most part, they have one thing in common: They're scared. Worried. Insecure. Just like I was.

It's easy to see why. Conventional business wisdom breeds paranoia. If you don't get big fast, you lose. If you don't obsess about the competition, you will be crushed. If you don't make long-term plans, you'll be staggering in the dark.

Come on. Conventional wisdom is tired, upset, groggy, scared, and a pain in the ass to work with. It doesn't have to be like this.

Instead of spending your time worrying about what could, might, or may happen, spend your time on what matters now. Are your customers thrilled with your service today? Is your inbox flooded with word-of-mouth referrals today? Do your employees love their jobs today? Can people find what they're looking for on your website today? Be honest with yourself. If the answers aren't satisfactory, then I'd suggest that you truly have something to worry about -- no matter how beautiful and comprehensive your business plan is.

Tomorrow. Eventually. Next quarter. Next year. Five years from now. Exit strategy. Throw these words away. They don't matter. Today is all you have in business. Tomorrow is just today again. Next week? Seven todays in a row. A month isn't 30 days. It's 30 todays.

I'm not suggesting you stop thinking about the future. I'm telling you to stop stressing about it. Go on, get lazy.

Jason Fried is co-founder of 37signals, a Chicago-based software firm, and co-author of the book Rework, which was published in March. This is his first column for Inc.

http://www.inc.com/magazine/20100401/driven-to-distraction.html


5/24/10

5 Secrets of Self-Made Millionaires

They’re just like you. But with lots of money.

When you think “millionaire,” what image comes to mind? For many of us, it’s a flashy Wall Street banker type who flies a private jet, collects cars and lives the kind of decadent lifestyle that would make Donald Trump proud.

But many modern millionaires live in middle-class neighborhoods, work full-time and shop in discount stores like the rest of us. What motivates them isn’t material possessions but the choices that money can bring: “For the rich, it’s not about getting more stuff. It’s about having the freedom to make almost any decision you want,” says T. Harv Eker, author of Secrets of the Millionaire Mind. Wealth means you can send your child to any school or quit a job you don’t like.

According to the Spectrem Wealth Study, an annual survey of America’s wealthy, there are more people living the good life than ever before—the number of millionaires nearly doubled in the last decade. And the rich are getting richer. To make it onto the Forbes 400 list of the richest Americans, being a mere billionaire no longer makes the cut: this year you needed a net worth of at least $1.3 billion.

If more people are getting richer than ever, why shouldn’t you be one of them? Here, five people who have at least a million dollars in liquid assets share the secrets that helped them get there.

1. Set your sights on where you’re going
Twenty years ago, Jeff Harris hardly seemed on the road to wealth. He was a college dropout who struggled to support his wife, DeAnn, and three kids, working as a grocery store clerk and at a junkyard where he melted scrap metal alongside convicts. “At times we were so broke that we washed our clothes in the bathtub because we couldn’t afford the Laundromat.” Now he’s a 49-year-old investment advisor and multimillionaire in York, South Carolina.

There was one big reason Jeff pulled ahead of the pack: He always knew he’d be rich. The reality is that 80 percent of Americans worth at least $5 million grew up in middle-class or lesser households, just like Jeff.

Wanting to be wealthy is a crucial first step. Says Eker, “The biggest obstacle to wealth is fear. People are afraid to think big, but if you think small, you’ll only achieve small things.”

It all started for Jeff when he met a stockbroker at a Christmas party. “Talking to him, it felt like discovering fire,” he says. “I started reading books about investing during my breaks at the grocery store, and I began putting $25 a month in a mutual fund.” Next he taught a class at a local community college on investing. His students became his first clients, which led to his investment practice. “There were lots of struggles,” says Jeff, “but what got me through it was believing with all my heart that I would succeed.”

2. Educate yourself
When Steve Maxwell graduated from college, he had an engineering degree and a high-tech job—but he couldn’t balance his checkbook. “I took one finance class in college but dropped it to go on a ski trip,” says the 45-year-old father of three, who lives in Windsor, Colorado. “I actually had to go to my bank and ask them to teach me how to read my statement.”

One of the biggest obstacles to making money is not understanding it: Thousands of us avoid investing because we just don’t get it. But to make money, you must be financially literate. “It bothered me that I didn’t understand this stuff,” says Steve, “so I read books and magazines about money management and investing, and I asked every financial whiz I knew to explain things to me.”

He and his wife started applying the lessons: They made a point to live below their means. They never bought on impulse, always negotiated better deals (on their cars, cable bills, furniture) and stayed in their home long after they could afford a more expensive one. They also put 20 percent of their annual salary into investments.

Within ten years, they were millionaires, and people were coming to Steve for advice. “Someone would say, ‘I need to refinance my house—what should I do?’ A lot of times, I wouldn’t know the answer, but I’d go find it and learn something in the process,” he says.

In 2003, Steve quit his job to become part owner of a company that holds personal finance seminars for employees of corporations like Wal-Mart. He also started going to real estate investment seminars, and it’s paid off: He now owns $30 million worth of investment properties, including apartment complexes, a shopping mall and a quarry.

“I was an engineer who never thought this life was possible, but all it truly takes is a little self-education,” says Steve. “You can do anything once you understand the basics.”

3. Passion pays off
In 1995, Jill Blashack Strahan and her husband were barely making ends meet. Like so many of us, Jill was eager to discover her purpose, so she splurged on a session with a life coach. “When I told her my goal was to make $30,000 a year, she said I was setting the bar too low. I needed to focus on my passion, not on the paycheck.”

Jill, who lives with her son in Alexandria, Minnesota, owned a gift basket company and earned just $15,000 a year. She noticed that when she let potential buyers taste the food items, the baskets sold like crazy. Jill thought, Why not sell the food directly to customers in a fun setting?

With $6,000 in savings, a bank loan and a friend’s investment, Jill started packaging gourmet foods in a backyard shed and selling them at taste-testing parties. It wasn’t easy. “I remember sitting outside one day, thinking we were three months behind on our house payment, I had two employees I couldn’t pay, and I ought to get a real job. But then I thought, No, this is your dream. Recommit and get to work.”

She stuck with it, even after her husband died three years later. “I live by the law of abundance, meaning that even when there are challenges in life, I look for the win-win,” she says.

The positive attitude worked: Jill’s backyard company, Tastefully Simple, is now a direct-sales business, with $120 million in sales last year. And Jill was named one of the top 25 female business owners in North America by Fast Company magazine.

According to research by Thomas J. Stanley, author of The Millionaire Mind, over 80 percent of millionaires say they never would have been successful if their vocation wasn’t something they cared about.

4. Grow your money
Most of us know the never-ending cycle of living paycheck to paycheck. “The fastest way to get out of that pattern is to make extra money for the specific purpose of reinvesting in yourself,” says Loral Langemeier, author of The Millionaire Maker. In other words, earmark some money for the sole purpose of investing it in a place where it will grow dramatically—like a business or real estate.

There are endless ways to make extra money for investing—you just have to be willing to do the work. “Everyone has a marketable skill,” says Langemeier. “When I started out, I had a tutoring business, seeing clients in the morning before work and on my lunch break.”

A little moonlighting cash really can grow into a million. Twenty-five years ago, Rick Sikorski dreamed of owning a personal training business. “I rented a tiny studio where I charged $15 an hour,” he says. When money started trickling in, he squirreled it away instead of spending it, putting it all back into the business. Rick’s 400-square-foot studio is now Fitness Together, a franchise based in Highlands Ranch, Colorado, with more than 360 locations worldwide. And he’s worth over $40 million.

When extra money rolls in, it’s easy to think, Now I can buy that new TV. But if you want to get rich, you need to pay yourself first, by putting money where it will work hard for you—whether that’s in your retirement fund, a side business or investments like real estate.

5. No guts, no glory
Last summer, Dave Lindahl footed the bill for 18 relatives at a fancy mansion in the Adirondacks. One night, his dad looked out at the scenery and joked, “I can’t believe we used to call you the black sheep!”

At 29, Dave was broke, living in a small apartment near Boston and wondering what to do after ten years in a local rock band. “I looked around and thought, If I don’t do something, I’ll be stuck here forever.”

He started a landscape company, buying his equipment on credit. When business literally froze over that winter, a banker friend asked if he’d like to renovate a foreclosed home. “I’m a terrible carpenter, but I needed the money, so I went to some free seminars at Home Depot and figured it out as I went,” he says.

After a few more renovations, it occurred to him: Why not buy the homes and sell them for profit? He took a risk and bought his first property. Using the proceeds, he bought another, and another. Twelve years later, he owns apartment buildings, worth $143 million, in eight states.

The Biggest Secret? Stop spending.
Every millionaire we spoke to has one thing in common: Not a single one spends needlessly. Real estate investor Dave Lindahl drives a Ford Explorer and says his middle-class neighbors would be shocked to learn how much he’s worth. Fitness mogul Rick Sikorski can’t fathom why anyone would buy bottled water. Steve Maxwell, the finance teacher, looked at a $1.5 million home but decided to buy one for half the price because “a house with double the cost wouldn’t give me double the enjoyment.”


http://shine.yahoo.com/event/financiallyfit/5-secrets-of-self-made-millionaires-1370279/

5/18/10

Examples for Exporting to EXCEL Workbook Files

Create and Export a Parameter Query to EXCEL file via TransferSpreadsheet (VBA)

Create a Query and Export multiple "filtered" versions of a Query (based on data in another table) to separate EXCEL files via TransferSpreadsheet (VBA)

Create a Query and Export multiple "filtered" versions of a Query (based on data in another table) to separate Worksheets within one EXCEL file via TransferSpreadsheet (VBA)

Write Data From a Recordset into an EXCEL Worksheet using Automation (VBA)

Write Data From a Recordset into an EXCEL Worksheet using EXCEL's CopyFromRecordset (VBA)

Browse to a single EXCEL File and Export Data to that EXCEL File via TransferSpreadsheet (VBA)

Browse to a single Folder and Export Data to a New EXCEL File in that Folder via TransferSpreadsheet (VBA)

Using the Range Argument of TransferSpreadsheet when Exporting Data to an EXCEL File (VBA)

Create and Export a Parameter Query to EXCEL file via TransferSpreadsheet (VBA)

Generic code to generate "on the fly" a query that uses one or more controls on an open form as parameters, and then export that query to an EXCEL file. This example concatenates the parameter values into the generated SQL statement and then saves the query so that it can be exported. The query then is deleted after the export is completed.

Dim dbs As DAO.Database
Dim qdfTemp As DAO.QueryDef
Dim strSQL As String, strQDF As String
Set dbs = CurrentDb

' Replace NameOfTableOrQuery with the real name of the table or query,
' replace NameOfForm with the real name of the form, and replace
' ADateControlOnForm and AnotherDateControlOnForm with the real names
' of the controls on that form

strSQL = "SELECT NameOfTableOrQuery.* FROM NameOfTableOrQuery " & _
"WHERE NameOfTableOrQuery.FieldName >= " & _
Format(Forms!NameOfForm!ADateControlOnForm.Value,"\#mm\/dd\/yyyy\#") & _
" And NameOfTableOrQuery.FieldName <=" & _
Format(Forms!NameOfForm!AnotherDateControlOnForm.Value,"\#mm\/dd\/yyyy\#") & "';"

strQDF = "_TempQuery_"
Set qdfTemp = dbs.CreateQueryDef(strQDF, strSQL)
qdfTemp.Close
Set qdfTemp = Nothing

' Replace C:\MyFolderName\MyFileName.xls with the real path and filename for the
' EXCEL file that is to contain the exported data

DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel9, _
strQDF,"C:\MyFolderName\MyFileName.xls"

dbs.QueryDefs.Delete strQDF
dbs.Close
Set dbs = Nothing

Create a Query and Export multiple "filtered" versions of a Query (based on data in another table) to separate EXCEL files via TransferSpreadsheet (VBA)

Generic code to create a temporary query, get a list of filtering values, and then loop through the list to filter the data and export each filtered query to a separate EXCEL file. In this sample code, the employees assigned to each manager are exported to separate EXCEL files, one file for each manager.

Dim qdf As DAO.QueryDef
Dim dbs As DAO.Database
Dim rstMgr As DAO.Recordset
Dim strSQL As String, strTemp As String, strMgr As String

Const strQName As String = "zExportQuery"

Set dbs = CurrentDb

' Create temporary query that will be used for exporting data;
' we give it a dummy SQL statement initially (this name will
' be changed by the code to conform to each manager's identification)

strTemp = dbs.TableDefs(0).Name
strSQL = "SELECT * FROM [" & strTemp & "] WHERE 1=0;"
Set qdf = dbs.CreateQueryDef(strQName, strSQL)
qdf.Close
strTemp = strQName

' *** code to set strSQL needs to be changed to conform to your
' *** database design -- ManagerID and EmployeesTable need to
' *** be changed to your table and field names
' Get list of ManagerID values -- note: replace my generic table and field names
' with the real names of the EmployeesTable table and the ManagerID field

strSQL = "SELECT DISTINCT ManagerID FROM EmployeesTable;"
Set rstMgr = dbs.OpenRecordset(strSQL, dbOpenDynaset, dbReadOnly)

' Now loop through list of ManagerID values and create a query for each ManagerID
' so that the data can be exported -- the code assumes that the actual names
' of the managers are in a lookup table -- again, replace generic names with
' real names of tables and fields
If rstMgr.EOF = False And rstMgr.BOF = False Then
    rstMgr.MoveFirst
    Do While rstMgr.EOF = False

        ' *** code to set strMgr needs to be changed to conform to your
        ' *** database design -- ManagerNameField, ManagersTable, and
        ' *** ManagerID need to be changed to your table and field names

        strMgr = DLookup("ManagerNameField", "ManagersTable", _
              "ManagerID = " & rstMgr!ManagerID.Value)

        ' *** code to set strSQL needs to be changed to conform to your
        ' *** database design -- ManagerID and EmployeesTable need to
        ' *** be changed to your table and field names

        strSQL = "SELECT * FROM EmployeesTable WHERE " & _
              "ManagerID = " & rstMgr!ManagerID.Value & ";"
        Set qdf = dbs.QueryDefs(strTemp)
        qdf.Name = "q_" & strMgr
        strTemp = qdf.Name
        qdf.SQL = strSQL
        qdf.Close
        Set qdf = Nothing

        ' Replace C:\FolderName\ with actual path
        DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel9, _
              strTemp, "C:\FolderName\" & strMgr & Format(Now(), _
              "ddMMMyyyy_hhnn") & ".xls"
        rstMgr.MoveNext
    Loop
End If

rstMgr.Close
Set rstMgr = Nothing

dbs.QueryDefs.Delete strTemp
dbs.Close
Set dbs = Nothing

Create a Query and Export multiple "filtered" versions of a Query (based on data in another table) to separate Worksheets within one EXCEL file via TransferSpreadsheet (VBA)

Generic code to create a temporary query, get a list of filtering values, and then loop through the list to filter the data and export each filtered query to a separate worksheet within one EXCEL file. In this sample code, the employees assigned to each manager are exported to separate worksheets within the same EXCEL file, one worksheet for each manager.

Dim qdf As DAO.QueryDef
Dim dbs As DAO.Database
Dim rstMgr As DAO.Recordset
Dim strSQL As String, strTemp As String, strMgr As String

' Replace PutEXCELFileNameHereWithoutdotxls with actual EXCEL
' filename without the .xls extension
' (for example, MyEXCELFileName, BUT NOT MyEXCELFileName.xls)

Const strFileName As String = "PutEXCELFileNameHereWithoutdotxls"

Const strQName As String = "zExportQuery"

Set dbs = CurrentDb

' Create temporary query that will be used for exporting data;
' we give it a dummy SQL statement initially (this name will
' be changed by the code to conform to each manager's identification)

strTemp = dbs.TableDefs(0).Name
strSQL = "SELECT * FROM [" & strTemp & "] WHERE 1=0;"
Set qdf = dbs.CreateQueryDef(strQName, strSQL)
qdf.Close
strTemp = strQName

' *** code to set strSQL needs to be changed to conform to your
' *** database design -- ManagerID and EmployeesTable need to
' *** be changed to your table and field names
' Get list of ManagerID values -- note: replace my generic table and field names
' with the real names of the EmployeesTable table and the ManagerID field

strSQL = "SELECT DISTINCT ManagerID FROM EmployeesTable;"
Set rstMgr = dbs.OpenRecordset(strSQL, dbOpenDynaset, dbReadOnly)

' Now loop through list of ManagerID values and create a query for each ManagerID
' so that the data can be exported -- the code assumes that the actual names
' of the managers are in a lookup table -- again, replace generic names with
' real names of tables and fields

If rstMgr.EOF = False And rstMgr.BOF = False Then
    rstMgr.MoveFirst
    Do While rstMgr.EOF = False

        ' *** code to set strMgr needs to be changed to conform to your
        ' *** database design -- ManagerNameField, ManagersTable, and
        ' *** ManagerID need to be changed to your table and field names

        strMgr = DLookup("ManagerNameField", "ManagersTable", _
              "ManagerID = " & rstMgr!ManagerID.Value)

        ' *** code to set strSQL needs to be changed to conform to your
        ' *** database design -- ManagerID and EmployeesTable need to
        ' *** be changed to your table and field names

        strSQL = "SELECT * FROM EmployeesTable WHERE " & _
              "ManagerID = " & rstMgr!ManagerID.Value & ";"
        Set qdf = dbs.QueryDefs(strTemp)
        qdf.Name = "q_" & strMgr
        strTemp = qdf.Name
        qdf.SQL = strSQL
        qdf.Close
        Set qdf = Nothing

        ' Replace C:\FolderName\ with actual path
        DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel9, _
              strTemp, "C:\FolderName\" & strFileName & ".xls"
        rstMgr.MoveNext
    Loop
End If

rstMgr.Close
Set rstMgr = Nothing

dbs.QueryDefs.Delete strTemp
dbs.Close
Set dbs = Nothing

Write Data From a Recordset into an EXCEL Worksheet using Automation (VBA)

Generic code to open a recordset for the data that are to be written into a worksheet in an EXCEL file (for this example, the EXCEL file must already exist, and the worksheet must already exist in the EXCEL file), and then to loop through the recordset and write each field's value into a cell in the worksheet, with each record being written into a separate row in the worksheet. The starting cell for the EXCEL worksheet is specified in the code; after that, the data are written into contiguous cells and rows. This code example uses "late binding" for the EXCEL automation.

Dim lngColumn As Long
Dim xlx As Object, xlw As Object, xls As Object, xlc As Object
Dim dbs As DAO.Database
Dim rst As DAO.Recordset
Dim blnEXCEL As Boolean, blnHeaderRow As Boolean

blnEXCEL = False

' Replace True with False if you do not want the first row of
' the worksheet to be a header row (the names of the fields
' from the recordset)

blnHeaderRow = True

' Establish an EXCEL application object
On Error Resume Next
Set xlx = GetObject(, "Excel.Application")
If Err.Number <> 0 Then
    Set xlx = CreateObject("Excel.Application")
    blnEXCEL = True
End If
Err.Clear
On Error GoTo 0

' Change True to False if you do not want the workbook to be
' visible when the code is running

xlx.Visible = True

' Replace C:\Filename.xls with the actual path and filename
' of the EXCEL file into which you will write the data

Set xlw = xlx.Workbooks.Open("C:\Filename.xls")

' Replace WorksheetName with the actual name of the worksheet
' in the EXCEL file
' (note that the worksheet must already be in the EXCEL file)

Set xls = xlw.Worksheets("WorksheetName")

' Replace A1 with the cell reference into which the first data value
' is to be written

Set xlc = xls.Range("A1") ' this is the first cell into which data go

Set dbs = CurrentDb()

' Replace QueryOrTableName with the real name of the table or query
' whose data are to be written into the worksheet

Set rst = dbs.OpenRecordset("QueryOrTableName", dbOpenDynaset, dbReadOnly)

If rst.EOF = False And rst.BOF = False Then

    rst.MoveFirst

    If blnHeaderRow = True Then
        For lngColumn = 0 To rst.Fields.Count - 1
            xlc.Offset(0, lngColumn).Value = rst.Fields(lngColumn).Name
        Next lngColumn
        Set xlc = xlc.Offset(1, 0)
    End If

    ' write data to worksheet
    Do While rst.EOF = False
        For lngColumn = 0 To rst.Fields.Count - 1
            xlc.Offset(0, lngColumn).Value = rst.Fields(lngColumn).Value
        Next lngColumn
        rst.MoveNext
        Set xlc = xlc.Offset(1, 0)
    Loop

End If

rst.Close
Set rst = Nothing

dbs.Close
Set dbs = Nothing

' Close the EXCEL file while saving the file, and clean up the EXCEL objects
Set xlc = Nothing
Set xls = Nothing
xlw.Close True ' close the EXCEL file and save the new data
Set xlw = Nothing
If blnEXCEL = True Then xlx.Quit
Set xlx = Nothing

Write Data From a Recordset into an EXCEL Worksheet using EXCEL's CopyFromRecordset (VBA)

Generic code to open a recordset for the data that are to be written into a worksheet in an EXCEL file (for this example, the EXCEL file does not already exist), and then to use EXCEL's CopyFromRecordset method to copy the data from the recordset into the first worksheet in that EXCEL file, with each record being written into a separate row in the worksheet. The code allows for a header row to be created in the worksheet if this is desired. This code example uses "late binding" for the EXCEL automation.

Dim lngColumn As Long
Dim xlx As Object, xlw As Object, xls As Object, xlc As Object
Dim dbs As DAO.Database
Dim rst As DAO.Recordset
Dim strPathFileName As String, strWorksheetName As String
Dim strRecordsetDataSource As String
Dim blnEXCEL As Boolean, blnHeaderRow As Boolean

blnEXCEL = False

' Replace C:\Filename.xls with the actual path and filename
' that will be used to save the new EXCEL file into which you
' will write the data

strPathFileName = "C:\Filename.xls"

' Replace QueryOrTableName with the real name of the table or query
' whose data are to be written into the worksheet

strRecordsetDataSource = "QueryOrTableName"

' Replace True with False if you do not want the first row of
' the worksheet to be a header row (the names of the fields
' from the recordset)

blnHeaderRow = True

' Establish an EXCEL application object
On Error Resume Next
Set xlx = GetObject(, "Excel.Application")
If Err.Number <> 0 Then
    Set xlx = CreateObject("Excel.Application")
    blnEXCEL = True
End If
Err.Clear
On Error GoTo 0

' Change True to False if you do not want the workbook to be
' visible when the code is running

xlx.Visible = True

' Create a new EXCEL workbook
Set xlw = xlx.Workbooks.Add

' Rename the first worksheet in the EXCEL file to be the first 31
' characters of the string in the strRecordsetDataSource variable

Set xls = xlw.Worksheets(1)
xls.Name = Trim(Left(strRecordsetDataSource, 31))

' Replace A1 with the cell reference of the first cell into which the
' headers will be written (blnHeaderRow = True), or into which the data
' values will be written (blnHeaderRow = False)

Set xlc = xls.Range("A1")

Set dbs = CurrentDb()

Set rst = dbs.OpenRecordset(strRecordsetDataSource, dbOpenDynaset, dbReadOnly)

If rst.EOF = False And rst.BOF = False Then
    ' Write the header row to worksheet
    If blnHeaderRow = True Then
        For lngColumn = 0 To rst.Fields.Count - 1
            xlc.Offset(0, lngColumn).Value = rst.Fields(lngColumn).Name
        Next lngColumn
        Set xlc = xlc.Offset(1, 0)
    End If

    ' copy the recordset's data to worksheet
    xlc.CopyFromRecordset rst
End If

rst.Close
Set rst = Nothing
dbs.Close
Set dbs = Nothing

' Save and close the EXCEL file, and clean up the EXCEL objects
Set xlc = Nothing
Set xls = Nothing
xlw.SaveAs strPathFileName
xlw.Close False
Set xlw = Nothing
If blnEXCEL = True Then xlx.Quit
Set xlx = Nothing

Browse to a single EXCEL File and Export Data to that EXCEL File via TransferSpreadsheet (VBA)

Generic code to browse to a single EXCEL file, and then to export the data to that EXCEL file. This generic method uses the Windows API to browse to a single file; the code for this API (which was written by Ken Getz) is located at The ACCESS Web ( www.mvps.org/access ).

The first step is to paste all the Getz code (from http://www.mvps.org/access/api/api0001.htm ) into a new, regular module in your database. Be sure to give the module a unique name (i.e., it cannot have the same name as any other module, function, or subroutine in the database). Then use this generic code to allow the user to select the EXCEL file to which the data are to be exported.

Dim strPathFile As String
Dim strTable As String, strBrowseMsg As String
Dim strFilter As String, strInitialDirectory As String
Dim blnHasFieldNames As Boolean

strBrowseMsg = "Select the EXCEL file:"

' Change C:\MyFolder\ to the path for the folder where the Browse
' window is to start (the initial directory). If you want to start in
' ACCESS' default folder, delete C:\MyFolder\ from the code line,
' leaving an empty string as the value being set as the initial
' directory

strInitialDirectory = "C:\MyFolder\"

strFilter = ahtAddFilterItem(strFilter, "Excel Files (*.xls)", "*.xls")

strPathFile = ahtCommonFileOpenSave(InitialDir:=strInitialDirectory, _
Filter:=strFilter, OpenFile:=False, _
DialogTitle:=strBrowseMsg, _
Flags:=ahtOFN_HIDEREADONLY)

If strPathFile = "" Then
    MsgBox "No file was selected.", vbOKOnly, "No Selection"
    Exit Sub
End If

' Replace tablename with the real name of the table from which
' the data are to be exported

strTable = "tablename"

DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel9, _
strTable, strPathFile

Browse to a single Folder and Export Data to a New EXCEL File in that Folder via TransferSpreadsheet (VBA)

Generic code to browse to a single folder, and then to export the data to a new EXCEL file that is created in that folder. This generic method uses the Windows API to browse to a single folder; the code for this API (which was written by Terry Kreft) is located at The ACCESS Web ( www.mvps.org/access ).

The first step is to paste all the Kreft code (from http://www.mvps.org/access/api/api0002.htm ) into a new, regular module in your database. Be sure to give the module a unique name (i.e., it cannot have the same name as any other module, function, or subroutine in the database). Then use this generic code to allow the user to select the folder in which the new EXCEL file is to be created.

Dim strPathFile As String, strFile As String, strPath As String
Dim strTable As String, strBrowseMsg As String
Dim blnHasFieldNames As Boolean

strBrowseMsg = "Select the folder where the new EXCEL file will be created:"

strPath = BrowseFolder(strBrowseMsg)

If strPath = "" Then
    MsgBox "No folder was selected.", vbOKOnly, "No Selection"
    Exit Sub
End If

' Replace filename.xls with the real name of the EXCEL file
' that is to be created and into which the data are to be
' exported

strFile = "filename.xls"

' Replace tablename with the real name of the table from which
' the data are to be exported

strTable = "tablename"

strPathFile = strPath & "\" & strFile

DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel9, _
strTable, strPathFile

Using the Range Argument of TransferSpreadsheet when Exporting Data to an EXCEL File (VBA)

After I read many posts in the newsgroups about using the Range argument for exporting queries/tables to an EXCEL file, I decided to do some testing to figure out what actually works and what doesn't work when using this argument (NOTE that the use of the Range argument for exports is an UNDOCUMENTED feature in ACCESS).

Here are the results of my tests for your information and entertainment. My tests were done with ACCESS 2003 and EXCEL 2003.

EXCEL FILE DOES NOT ALREADY EXIST
---------------------------------------------------------------

If the EXCEL file will be created by TransferSpreadsheet, the Range argument
can be used to create a range in the new file that describes the cells that
contain the exported data on the worksheet. This Range argument also is used
to name the worksheet onto which the exported data are written. This
overrides the normal operation of TransferSpreadsheet, which is to name the
worksheet using the name of the table or query being exported. For example,
this action:

DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel9, _
"QueryName", "C:\Test.xls", , "MyName"

will create the file "C:\Test.xls" and the data will be written onto the
first worksheet, which will be named MyName (the Worksheet.Name property,
not the Worksheet.CodeName property); and the cells into which the data are
written will be a Range named MyName in the new file. This range will
include the field names that are exported as the first row of data, and the
range will begin in cell A1.

EXCEL FILE ALREADY EXISTS
------------------------------------------------

The Range argument can be used to identify the actual Range into which the
exported data are written. TransferSpreadsheet ignores worksheet names when
looking for the Range in the workbook file. It looks specifically for a
defined Range of cells.

However, it is in this situation where I believe many posters have run into
problems with this undocumented feature.

1) If the Range exists (cell range, that is) AND if that range encompasses
more than a single cell (at least two cells), the data are exported to that
range of cells. If the number of records and/or fields is more or fewer
than the "size" of the range (number of rows and columns), the data are
correctly exported and the Range is redefined to match the size of the
exported data in terms of width and depth of the range (number of rows and
number of columns). Note that any formatting in the cells within this range
is retained (e.g., Bold, Highlight color, font color, etc.).


2) If the Range does not exist in the workbook file, TransferSpreadsheet
creates a new worksheet, names it with the Range argument value, writes the
data onto that worksheet, and creates a new Range (also named with the Range
argument value) to define the cells that contain the exported data. If a
worksheet with the same name as what is in the Range argument already exists
in the workbook file, the new worksheet that is created is named using the
standard EXCEL convention, namely, the Range argument name followed by a 1.
Thus, if I use MyName as the Range argument and export to an existing file,
I can get one of the following results:

a) File already contains a worksheet named MyName but does not
contain a Range named MyName: A new worksheet named MyName1 is created, the
data are written onto that worksheet, and a new Range named MyName is
defined for the cells that received those exported data.

b) File does not contain a worksheet named MyName and does not
contain a Range named MyName: A new worksheet named MyName is created, the
data are written onto that worksheet, and a new Range named MyName is
defined for the cells that received those exported data.


3) If the Range exists (cell range, that is) AND if the Range consists of
a single cell (e.g., A1), then strange things happen -- note that it doesn't
matter if the Range starts in cell A1 or not. And because of these strange
things, this is where the feature is unusable for exporting. I haven't
defined exact "rules" to describe what happens (although how far the range
is moved appears to be "the original row number plus 93" columns, if the
Range was originally in column A), but here are my observations in this
situation (I won't guarantee that you won't see different behaviors):

a) If the worksheet name is the same name as the Range name, and
the Range begins in cell A1, the exported data are written to the worksheet
that contains the Range specified in the TransferSpreadsheet action, and
these data begin at cell A1 (with the field names row) -- BUT the existing
range is moved to cell CQ1 (94 columns to the right), and there is no Range
created for the cells that contain the exported data. Any further attempt to
export to this worksheet using the same Range argument generates an error
because the "move" of the range will extend beyond the column limit of the
worksheet.

b) If the worksheet name is the same name as the Range name, and
the Range begins in cell A5, the exported data are written to the worksheet
that contains the Range specified in the TransferSpreadsheet action, and
these data begin at cell E5 (with the field names row) -- BUT the existing
range is moved to cell CU5 (98 columns to the right), and there is no Range
created for the cells that contain the exported data. Any further attempt to
export to this worksheet using the same Range argument generates an error
because the "move" of the range will extend beyond the column limit of the
worksheet.

c) If the worksheet name is not the same as the Range name, and
the Range begins in cell A1, the exported data are written to a new
worksheet that is named the same as the Range argument value, and the
existing Range is then moved to cell IV1 (the last column in the sheet) on
that new worksheet, and there is no Range created for the cells that contain
the exported data.


http://www.accessmvp.com/KDSnell/EXCEL_Export.htm

Posting Levels in Dynamics GP – A Review

I have been getting questions from many of my clients and contacts about the various levels of posting to General Ledger accounts that are available in Dynamics GP, so I decided to share some information on the topic with this community.

The level of posting to the GL accounts is determined by a couple of setups working hand in hand in Dynamics GP, which I will elaborate on below. The initial setup is to define the level of posting for various accounts in the Account Maintenance window from Cards >> Financials >> Accounts, as illustrated below.

[screenshot: Account Maintenance window]

We can specify the level of posting from the various series into the General Ledger module. The options available for posting levels are:

  • Detail
  • Summary

Once this has been set up, we need to define the level of posting for various transactions in various modules in the Posting Setup window from Microsoft Dynamics GP >> Tools >> Setup >> Posting >> Posting, as illustrated below.

[screenshot: Posting Setup window]

So for the purpose of this case study, I created a couple of receivable invoices with the distributions explained below and saved them into a batch called RECVINV.

Invoice #1

Invoice Number   Account Number   Type    Debit (DR)   Credit (CR)
INV0001          000-1200-00      RECV    $500         -
INV0001          000-4100-00      SALES   -            $500

Invoice #2

Invoice Number   Account Number   Type    Debit (DR)   Credit (CR)
INV0002          000-1200-00      RECV    $700         -
INV0002          000-4100-00      SALES   -            $700

I have set up the account 000-1200-00 to post at a summary level in the Sales series, and the account 000-4100-00 to post in detail in the Sales series, both in the Account Maintenance window.

Now, in the Posting Setup window, if we select the option to create one journal entry per transaction, posting will always be done at a detailed level (irrespective of the setting specified in the Account Maintenance window); that is, there will be a one-to-one match between the distribution lines in the journal entry and the distributions on the transaction posted in the sub ledger.

So when the above batch RECVINV is posted in the Receivables module, two journal entries will be created, with the distributions shown below.

Journal Entry #1

Journal Entry   Account Number   Debit (DR)   Credit (CR)
12345           000-1200-00      $500         -
12345           000-4100-00      -            $500

Journal Entry #2

Journal Entry   Account Number   Debit (DR)   Credit (CR)
12346           000-1200-00      $700         -
12346           000-4100-00      -            $700

However, if we instead select the option in the Posting Setup window to create one journal entry per batch, there are two levels of roll-up available when journal entries are created from posted sub ledger transactions.

If the “Use Account Settings” option is unchecked, then when a batch of transactions is posted from the sub ledger module, the system creates one journal entry for all transactions posted in the sub ledger batch. However, no roll-up is done at the account level, even though the accounts have been set up to post at a summary level in the Account Maintenance window.

So in the same case study example above, if the batch was posted in the Receivables module, there will be one journal entry created with the distributions illustrated below.

Journal Entry #1

Journal Entry   Account Number   Debit (DR)   Credit (CR)
12345           000-1200-00      $500         -
12345           000-4100-00      -            $500
12345           000-1200-00      $700         -
12345           000-4100-00      -            $700

If the “Use Account Settings” option is checked, then when a batch of transactions is posted from the sub ledger module, the system creates one journal entry for all transactions posted in the sub ledger batch, and the distribution amounts are rolled up based on the posting levels defined for the accounts in the Account Maintenance window.

So in the same case study example above, if the batch was posted in the Receivables module, there will be one journal entry created with the distributions illustrated below.

Journal Entry #1

Journal Entry   Account Number   Debit (DR)   Credit (CR)
12345           000-1200-00      $1200        -
12345           000-4100-00      -            $500
12345           000-4100-00      -            $700

Note: Keep in mind that these posting levels take effect only when we perform a batch posting in the sub ledger. If the posting is done at the transaction level, the system will always post in detail to the General Ledger (irrespective of the settings in the Account Maintenance window and the Posting Setup window for the specific sub ledger transaction).

Hope this article provides some insight into the General Ledger posting levels that are available in Dynamics GP.

http://cvakumar.com/msdynamics/2010/05/16/posting-levels-in-dynamics-gp-a-review/

Maintaining Data Integrity between Sub Ledgers and General Ledger

I have been involved in reconciliations between the sub ledgers and the general ledger at various clients, and there have been various scenarios in which a break occurs between the sub ledger and general ledger balances. A few key scenarios are listed below:

  1. A transaction posted in the sub ledger does not have a corresponding transaction in the general ledger.
  2. A transaction posted in the sub ledger is backed out/corrected at the general ledger level.
  3. Manual posting to the sub ledger control accounts in the General Ledger.

In this article, I am going to provide some tips to avoid the above situations and to ensure data integrity between the sub ledgers and the general ledger. This will help ensure that periodic audits of the system proceed smoothly.

The first two errors listed above can be prevented by disabling the following options in the General Ledger setup window from Microsoft Dynamics GP >> Tools >> Setup >> Financials >> General Ledger.

[screenshot: General Ledger Setup window]

By unmarking the option “Deletion of Saved Transactions”, a journal entry batch that is created when a sub ledger transaction is posted cannot be deleted at the General Ledger level. This prevents any deletion of a journal entry that was created from a sub ledger transaction. Note that if we have enabled “Post Through General Ledger” in the posting setup, this makes less of a difference, since whenever a transaction is posted in the sub ledger, the corresponding general ledger transactions are posted automatically as well.

By unmarking the option “Voiding/Correcting of Subsidiary Transactions”, the system will not allow us to void a journal entry batch that is created when a sub ledger transaction is posted. This prevents any voiding of a journal entry that was created from a sub ledger transaction; it also prevents the user from backing out (or backing out and correcting) a journal entry that was posted for a sub ledger transaction. As with the previous option, if “Post Through General Ledger” is enabled this makes less of a difference, since the corresponding general ledger transactions are posted automatically.

The third error listed above can be eliminated by disallowing account entry for all the sub ledger control accounts, such as the GL accounts for Accounts Receivable, Accounts Payable, and Inventory, in the Account Maintenance window from Cards >> Financial >> Accounts, as shown below. We need to unmark the option “Allow Account Entry” for each control account.

[screenshot: Account Maintenance window]

Unmarking this option for the key control accounts will prevent the user from picking these accounts in any transaction entry window, and the system will display the message shown below.

[screenshot: account entry warning message]

This option will prevent any direct posting into the sub ledger control accounts in the General Ledger. All postings will happen only from the sub ledger module.

The options mentioned above are some key setups we can enable in GP to minimize data integrity issues between the sub ledgers and the general ledger, issues that otherwise consume precious reconciliation time from many resources during key audit periods.

http://cvakumar.com/msdynamics/2010/05/16/maintaining-data-integrity-between-sub-ledgers-and-general-ledger/

5/17/10

How to Hammer Out Deal Terms Like Warren Buffett

In the early stages of selling your business, you're better off taking cues from the Oracle of Omaha and keeping the lawyers away from the negotiations.

I once went to a Broadway play starring James Belushi. I had good seats, so Belushi was only a few feet in front of me. I remember how captivating it was to be that close to someone famous. I became engrossed in the show and felt as though Belushi was talking directly to me. I lost touch with time and my surroundings. I forgot about all of the work behind the scenes—the lighting, staging, music—and just sat there and enjoyed Belushi’s performance.

Selling your business should work like a good play: You should put your best, most engaging actors onstage and keep the people handling the messier details of the deal behind the curtain. In my opinion, the lawyers should remain offstage until you agree to a set of business terms with the acquirer.

It's tempting to let your lawyer be the “bad cop” in the early stages of negotiations, pounding on the table, demanding better terms. However, coming to an agreement on the core business terms before the lawyers get involved will ensure a positive foundation that will serve you well when the legal teams inevitably reach loggerheads on some of the finer details.

One of the biggest hurdles in coming to an agreement on the core business terms is drafting a simple letter that outlines the deal in language both parties understand. It's hard not to get mired in the details and document every conceivable possibility. If you get bogged down, take inspiration from Warren Buffett.

Buffett is the master of writing about business in plain English. When Buffett tries to buy a company, he approaches the CEO with a short, plain-spoken letter before any of the deal-makers get involved. Read Buffett’s annual letter to Berkshire Hathaway shareholders if you find your correspondence with a potential acquirer becoming too technical.

Here are a few things to work out with an acquirer (in plain English) before you get the lawyers involved:

  • The cash on closing (what you get the day the deal closes)
  • The earn-out (or vendor take-back) you are agreeing to, along with a clear understanding of what budget and resources you have at your disposal to meet the goals you’re signing up for
  • What happens if you decide to leave or the buyer asks you to leave before the earn-out or transition period is over
  • The working capital in the business at closing (how much money needs to be left in the company the day it changes hands)
  • Your salary and benefits when you become an employee of the acquirer
  • The degree of decision-making autonomy you’ll have as an employee during the earn-out or transition period (e.g., Do you get to decide what technology platform to use? Do you control decisions around marketing and selling? How about hiring and firing?)

If you can agree to a set of business terms, you’ll be building a good working relationship with your acquirer, which will be important once you become a division of their business. Besides, if you can’t come to an agreement on the basic business terms with a buyer, no two lawyers will be able to agree to the legal terms, which are infinitely more complex.

So follow Buffett’s lead and get the basics down in plain English before you invite the lawyers onto the stage.

http://www.inc.com/articles/2010/04/negotiating-basic-deal-terms.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+inc%2Fheadlines+%28Inc.com+Headlines%29&utm_content=My+Yahoo

5 Ways to Ensure Mediocrity in Your Organization

The recession is no excuse for ignoring, misusing, or demeaning talent. But hey, if that's what you really want to do, follow these suggestions.

The last time I checked, the U.S. led the world in productivity per employee. That's the good news. The bad news is that much, if not all, of that boost in productivity has come on the backs of workers, especially salaried types viewed by too many management teams as infinitely elastic resources. As one management consultant told me: "The average company takes better care of its copiers than it does its talent."

Many chief executives use the tough competitive environment as a handy excuse to put off salary increases, tighten the screws on performance, and generally drop any pretense of creating a human-centered workplace. But the tough-economy picture has two sides. Only those companies that make the effort to keep their employees productive by treating them decently can expect to see continued productivity gains. Much of the workforce has tuned out, waiting for a more welcoming job market to make career moves. Those organizations that haven't wavered on their commitments to flexibility, recognition of talent, and transparent leadership will keep A-list players on board as the job market improves. Their competitors may be wishing they'd paid a little more attention to employee TLC as employees start peeling off for greener pastures.

Here are five of the most insulting leadership practices, the ones that virtually guarantee a business will end up with the most self-esteem challenged, optionless team members when the dust settles.

1. If you desire a mediocre workforce, make sure your employees know you don't trust them.

Nothing spells "You're dirt to us" like a corporate culture that screams, "We don't trust you as far as we can throw you." I refer to company policies that require employees to clock in and out for lunch or software that tracks every keystroke and change of URL in case a molecule of nonwork-related activity squeaks into the workday. When employees know they're not trusted, they become experts at "presenteeism"—the physical appearance of working, without anything getting done. Congratulations! Your inability to trust the very people you've selected to join your team has cost you their energy, goodwill, and great ideas.

2. If you want to drive talented people away, don't tell them when they shine.

Fear of a high-self-esteem employee is prevalent among average-grade corporate leadership teams. Look how hard it is for so many managers to say, "Hey Bob, you did a great job today." Maybe it's a fear that the bit of praise will be met with a request for a pay raise. Maybe it's the fear that acknowledging performance will somehow make the manager look weak. Whatever the reason for silence, leaders who can't say, "Thanks—good going!" can plan on bidding farewell to their most able team members in short order.

3. If you prefer a team of C-list players, keep employees in the dark.

Sharp knowledge workers want to know what's going on in their organizations, beyond their departmental silos. They want some visibility into the company's plans and their own career mobility. Leaders who can't stand to shine a light on their firms' goals, strategies, and systems are all but guaranteed to spend a lot of money running ads on Monster.com. Marketable top performers want a seat at the table and won't stand for being left in the dark without the information they need to do their jobs well.

4. If you value docility over ingenuity, shout it from the rooftops.

I heard from a new MBA who had joined a global manufacturer. "They told me during my first week that I need a manager's signature to organize a meeting," he recalled. "They said I'm too low-level to call a meeting on my own, because unauthorized meetings of nonmanagers are against company policy." How fearful of its employees would a leadership team have to be to forbid people to gather together to solve problems? The most desirable value creators won't stick around to be treated like children. They'll hop a bus to the first employer who tells them, "We're hiring you for your talent—now go do something brilliant."

5. If you fear an empowered workforce more than you fear the competition, squash any sign of individualism.

When you go to college, you learn about Economic Man, but in the corporate workplace we see that real people don't always act rationally. Lots of individual managers and plenty of leadership teams fear nothing more than the idea that a self-directed employee might buck authority. That's equivalent to shaking the organizational power structure to its foundation, possibly a fate worse than death. Leaders who want the most docile, sheep-like employees more than the smartest and ablest ones create systems to keep the C players on board and drive the A team out the door. They do it by instituting reams of pointless rules, upbraiding people for minuscule infractions ("What? Twenty minutes late? Sure you worked here until midnight last night, but starting time is starting time.") and generally replacing trust with fear throughout their organizations. Companies that operate in fear mode will never deliver great products and services to the marketplace. Their efforts will be hamstrung by their talent-repelling management practices.

How long will it take these enterprises to figure out they're shooting themselves in the foot? It doesn't matter—you'll be long gone by then.

http://finance.yahoo.com/career-work/article/109519/5-ways-to-ensure-mediocrity-in-your-organization?mod=career-leadership

5/12/10

SQL Server APPLY operator

If you would love two scoops of yummy SQL Server query goodness, the APPLY operator is for you.

As I've said before, rarely a day goes by when I don't say "D'oh!" in surprise (and embarrassment) at being unaware of a tool or technology that I probably should have known about, but just didn't.

The SQL Server APPLY operator is one of those gems that I should have known about 5 years ago. I really need to start buying those "What's New In SQL Server" books, and actually reading them.

Imagine this Microsoft Dynamics GP scenario: You need to query vendors and include the vendor's PRIMARY address information, as well as their REMIT TO address information. But it's not quite that simple (it never is).

You find that the client has inconsistent vendor address IDs in their database. Sometimes they just use the address ID of "REMIT", sometimes "REMIT TO" and sometimes "REMIT ADDRESS", among others. And naturally, there are some vendors who don't have a separate REMIT TO address at all. In theory, some vendors may have two or more remit addresses, so you have to handle that situation as well--in which case you only want the TOP 1 remit address record.

While trying to figure out how to write such a query, I came across an article (linked below) that discussed the OUTER APPLY operator.

Think of it as an OUTER JOIN, but without needing the ON keyword and subsequent matching columns from the two tables. The key difference from a JOIN is that the table expression on the right side of APPLY is evaluated once for each row of the left table, and it can reference that row's columns, which a derived table in an ordinary JOIN cannot do.

In short, it will make your inner SQL geek salivate.
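To make that concrete before the full Dynamics GP example below, here is a minimal sketch using a hypothetical Customers/Orders pair of tables (these table and column names are mine, purely for illustration, not from GP):

SELECT c.CUSTOMERID, lastOrd.ORDERDATE
FROM Customers c
OUTER APPLY (
    -- evaluated once per customer row; c.CUSTOMERID is in scope here
    SELECT TOP 1 o.ORDERDATE
    FROM Orders o
    WHERE o.CUSTOMERID = c.CUSTOMERID
    ORDER BY o.ORDERDATE DESC
) AS lastOrd

Customers with no orders still appear, with a NULL ORDERDATE. Swap in CROSS APPLY and those rows drop out, just as if you had moved from a LEFT JOIN to an INNER JOIN.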

Here is an example query that retrieves basic vendor information, along with the TOP 1 primary and remit to address records, allowing me to reference the fields in my query through a derived table.

Note that I used the OUTER APPLY version, since I want to return the vendor information even if there is no primary or remit-to address. If you only want to return records that have a match in your apply operation, you can use CROSS APPLY, which works like an INNER JOIN (hence the two scoops of yummy goodness).


SELECT
    RTRIM(v.VENDORID) AS VENDORID,
    RTRIM(v.VENDNAME) AS VENDNAME,
    RTRIM(ISNULL(vpa.VNDCNTCT, '')) AS CONTACT,
    RTRIM(ISNULL(vpa.ADDRESS1, '')) AS ADDRESS1,
    RTRIM(ISNULL(vpa.ADDRESS2, '')) AS ADDRESS2,
    RTRIM(ISNULL(vpa.CITY, '')) AS CITY,
    RTRIM(ISNULL(vpa.STATE, '')) AS STATE,
    RTRIM(ISNULL(vpa.ZIPCODE, '')) AS ZIPCODE,
    RTRIM(ISNULL(vra.ADDRESS1, '')) AS REMITADDRESS1,
    RTRIM(ISNULL(vra.ADDRESS2, '')) AS REMITADDRESS2,
    RTRIM(ISNULL(vra.CITY, '')) AS REMITCITY,
    RTRIM(ISNULL(vra.STATE, '')) AS REMITSTATE,
    RTRIM(ISNULL(vra.ZIPCODE, '')) AS REMITZIPCODE,
    RTRIM(ISNULL(vpa.PHNUMBR1, '')) AS PHONE1,
    RTRIM(ISNULL(vpa.PHNUMBR2, '')) AS PHONE2,
    RTRIM(ISNULL(vpa.FAXNUMBR, '')) AS FAX
FROM PM00200 v  -- PM Vendor Master
OUTER APPLY (
    -- TOP 1 primary address; the ORDER BY makes the pick deterministic
    SELECT TOP 1 *
    FROM PM00300 a  -- PM Vendor Address Master
    WHERE a.VENDORID = v.VENDORID AND a.ADRSCODE LIKE '%PRIMARY%'
    ORDER BY a.ADRSCODE
) AS vpa
OUTER APPLY (
    -- TOP 1 remit-to address, matching any of the inconsistent address IDs
    SELECT TOP 1 *
    FROM PM00300 a
    WHERE a.VENDORID = v.VENDORID AND a.ADRSCODE LIKE '%REMIT%'
    ORDER BY a.ADRSCODE
) AS vra


I told you it was tasty! Bon Appétit!
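And if you ever wanted only the vendors that actually have a remit-to address on file, the same pattern works with CROSS APPLY; here's a trimmed-down sketch against the same GP tables (untested, same caveats as above):

SELECT
    RTRIM(v.VENDORID) AS VENDORID,
    RTRIM(vra.ADDRESS1) AS REMITADDRESS1
FROM PM00200 v
CROSS APPLY (
    SELECT TOP 1 a.ADDRESS1
    FROM PM00300 a
    WHERE a.VENDORID = v.VENDORID AND a.ADRSCODE LIKE '%REMIT%'
    ORDER BY a.ADRSCODE
) AS vra

Vendors without a matching address row are filtered out, exactly as an INNER JOIN would do.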


Here are a few more articles on the topic:

http://msdn.microsoft.com/en-us/library/ms175156.aspx

http://www.sqlteam.com/article/using-cross-apply-in-sql-server-2005

http://decipherinfosys.wordpress.com/2007/10/08/apply-operator-in-sql-server-2005/

FROM: http://dynamicsgpland.blogspot.com/2010/05/sql-server-apply-operator.html

5/6/10

What financial data business owners need to know every day

Yesterday Sales
Yesterday Cash Flow
Last 30 Day Sales
Last 30 Day Cash Flow
Last 30 Day Top Selling Items
Last 365 Day Sales
Last 365 Day Cash Flow
Low Inventory Items
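
As a rough illustration, a couple of these numbers could come from queries like the following, assuming hypothetical SalesTransactions and SalesLineItems tables with PostDate, Amount, ItemNumber, and Quantity columns (all names are made up for the sketch; map them to your own system's tables):

-- Yesterday's sales
SELECT SUM(Amount) AS YesterdaySales
FROM SalesTransactions
WHERE PostDate >= DATEADD(day, -1, CAST(GETDATE() AS date))
  AND PostDate <  CAST(GETDATE() AS date)

-- Last 30 day top selling items
SELECT TOP 10 ItemNumber, SUM(Quantity) AS UnitsSold
FROM SalesLineItems
WHERE PostDate >= DATEADD(day, -30, CAST(GETDATE() AS date))
GROUP BY ItemNumber
ORDER BY UnitsSold DESC

The other metrics follow the same shape: widen the date range for the 365-day versions, sum receipts minus disbursements for cash flow, and compare on-hand quantity to a reorder point for low inventory.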