Hosting in bandwidth-constrained situations

Hosting web sites has never been less expensive, but there are often constraints that, when exceeded, can dramatically increase your costs.
One of the key areas to manage is bandwidth.  I recommend to my clients that they be aggressive in allowing only the major search engines and their targeted visitors access to their site.  Conserving and protecting bandwidth in this manner can keep you in the low-cost tier, and it has the secondary advantage of reducing noise in your traffic reports.
TIP #1 : Know why you are hosting and who your desired visitors are.  Your site likely has content that directly drives the sale of a product or service, or provides information that reduces customer service load.  Or you could be a media or blog site trying to draw eyeballs to the ads you have sold and host.  In most cases, this means you want legitimate humans who are using a browser.  You will also likely want the major search engines in your region to have unrestricted access.
TIP #2 : Read your traffic reports.  Learn who is visiting your site and what they are doing when they get there.  Is this a human or a major search engine?  Track how many bytes of network traffic went to each visitor and convince yourself that this is the type of visitor you want to commit hosting resources to.
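As a starting point, the per-visitor byte count can be pulled straight from an Apache combined-format access log with awk.  This is a minimal sketch: in that format field 1 is the client IP and field 10 is the response size in bytes; the log path in the usage line is an example, not your actual path.

```shell
#!/bin/sh
# bytes_by_ip: summarize bytes served per client IP from an Apache
# combined-format access log read on stdin.
# Field 1 is the client IP; field 10 is the response size in bytes.
bytes_by_ip() {
    awk '$10 ~ /^[0-9]+$/ { bytes[$1] += $10 }
         END { for (ip in bytes) printf "%s %d\n", ip, bytes[ip] }' |
        sort -k2,2 -rn
}
```

Typical use, to see your top 20 bandwidth consumers: `bytes_by_ip < /var/log/apache2/access.log | head -20`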
TIP #3 : Learn what bad behavior looks like in your traffic reports.  Is it a script pounding away trying to brute-force a login or post content?  Is it vacuuming up all your blog updates for purposes unknown?  Be aware that many blogs get repackaged and reposted by sites interested only in stealing your traffic and page rank.  If a visitor didn't download a style sheet, you can be confident it is not a human using a browser.  I have also seen many cases where the bad guys steal the main content and the traffic, but with some simple steps leave your site continuing to serve the graphics or downloads, giving you no benefit whatsoever.
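The no-stylesheet test above is easy to automate.  This sketch lists every client IP in a combined-format log that made requests but never fetched a `.css` file; note that legitimate crawlers can also trip this check, so whitelist the search engines you care about before acting on the output.

```shell
#!/bin/sh
# ips_without_css: from an Apache combined-format log on stdin, print
# client IPs that made requests but never fetched a stylesheet --
# a strong hint the visitor is not a human in a browser.
# Field 1 is the client IP; field 7 is the request path.
ips_without_css() {
    awk '$7 ~ /\.css/ { css[$1] = 1 }
         { seen[$1] = 1 }
         END { for (ip in seen) if (!(ip in css)) print ip }'
}
```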
TIP #4 : Ban the visitor IPs you don't want or can't understand.  There are many ways to do this, so check with your hosting provider on the options available to you.
If you are on your own server, the mechanism I like best is to configure Apache to rotate logs every 60 seconds.  When a new log set becomes available, process it to identify who has visited.  If the traffic and usage pattern matches your desired visitor, great!  If the traffic pattern exceeds your threshold for inappropriateness, or it doesn't look like a human (for example, no style sheets downloaded), then immediately have your processing script put in a null route for that visitor:
/sbin/route add -host <visitor ip> reject
While these routes will be cleared the next time the server reboots, that is probably a good thing: you avoid permanently blocking an IP.  By blocking a bad visitor within a couple of minutes of detection, you preserve bandwidth and thereby improve response time for your targeted visitors.  And your traffic reports will be much more useful once all the distortion is taken out.
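Putting the pieces together, the per-rotation processing step might look like the sketch below.  The byte threshold, log path, and whitelist file are illustrative assumptions, and the route command requires root; adapt all of them to your environment.

```shell
#!/bin/sh
# Sketch of processing one rotated Apache combined-format log and
# null-routing bad visitors.  Threshold and whitelist path are examples.

# find_bad_ips: print IPs from the log on stdin that either never
# fetched a stylesheet or transferred more than $1 bytes.
find_bad_ips() {
    awk -v max="$1" '
        $7 ~ /\.css/     { css[$1] = 1 }
        $10 ~ /^[0-9]+$/ { bytes[$1] += $10 }
                         { seen[$1] = 1 }
        END {
            for (ip in seen)
                if (!(ip in css) || bytes[ip] > max) print ip
        }'
}

# process_rotated_log: null-route each offender found in the log file
# named by $1, skipping whitelisted IPs (search engines, your own IP).
process_rotated_log() {
    find_bad_ips 10000000 < "$1" | while read -r ip; do
        grep -qxF "$ip" /etc/hosting/whitelist 2>/dev/null && continue
        /sbin/route add -host "$ip" reject   # requires root
    done
}
```

The 60-second rotation itself can be done with Apache's piped logging, for example a directive along the lines of `CustomLog "|/usr/sbin/rotatelogs /var/log/apache2/access.%s 60" combined` (the rotatelogs path varies by distribution).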
If you need help or consulting assistance implementing these and other mechanisms (there are many ways to do this!) to get the best use of your hosting resources, please do Contact Me for further information.