OK, so there is a common myth that performance and good design are contradictory: that we must often sacrifice good design for the sake of performance. Maybe so... but not always.
Good design is about proper decomposition of functional requirements, proper modularization, non-repetition, simplicity, reusability, sensible use of inheritance hierarchies, and keeping functionality well contained.
There is no reason for most of the above to affect performance. Sometimes, though, we create good design by adding layers and indirection. This may mean extra method calls and perhaps a few more classes, which can affect the application's performance somewhat. If the application in question is a web-based one, that hit will be negligible compared to the time taken by the web request and the database query. If, on the other hand, we are building software for an embedded device or an application with real-time requirements, the extra method calls and object creations might hurt. Either way, it is difficult to conclude anything without running a profiler.
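To make the idea of "layers and indirection" concrete, here is a minimal Python sketch. The class names (PriceService, PricingFacade) are hypothetical examples, not from any real codebase; the point is only that the layered version does the same work through one extra delegated call.

```python
class PriceService:
    """The object that does the actual work."""
    def price(self, quantity: int) -> int:
        return quantity * 10


class PricingFacade:
    """An extra layer of indirection: better modularity, one more method call."""
    def __init__(self) -> None:
        self._service = PriceService()

    def price(self, quantity: int) -> int:
        # Delegates to the underlying service; this is the "extra call".
        return self._service.price(quantity)


direct = PriceService()
layered = PricingFacade()

# Both produce the same result; the layered path just costs one more call.
assert direct.price(3) == layered.price(3) == 30
```

In most applications that one extra call is noise; only a profiler can tell you whether it matters in yours.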
So I'd say... first create a good design and make it work. If performance is an issue, run a profiler on the application and fine-tune the portions that are the biggest bottlenecks. If some design has to be sacrificed in the process, so be it. But sacrifice design only if it is absolutely necessary and there is no other option.
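As a sketch of that workflow, Python's built-in cProfile module can point at the hot spots. The function below is a made-up stand-in for a real bottleneck; in practice you would profile your actual application.

```python
import cProfile
import io
import pstats


def slow_sum(n: int) -> int:
    # Deliberately naive loop, standing in for a real hotspot.
    total = 0
    for i in range(n):
        total += i
    return total


# Profile just the suspect code path.
profiler = cProfile.Profile()
profiler.enable()
result = slow_sum(100_000)
profiler.disable()

# Print the top entries sorted by cumulative time.
stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream).sort_stats("cumulative")
stats.print_stats(5)
report = stream.getvalue()

assert "slow_sum" in report  # the hotspot shows up in the profile
```

Only once the profile names a genuine bottleneck is it worth weighing a design compromise against the measured cost.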
Note: This post was originally posted on my blog at http://www.adaptivelearningonline.net