if i were really smart i'd set up SELinux and use sandbox -X to create a secure sandbox for Firefox and my other networked applications. i still want to get around to this one day, but learning to write all the necessary policy rules seems like more than a 30-minute project.
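for the record, the basic invocation is only one line (a sketch, assuming the policycoreutils sandbox tools are installed and the stock sandbox policy types are available):

    # run Firefox in a throwaway SELinux X sandbox; sandbox_web_t is the
    # stock type that allows web traffic while isolating the session
    sandbox -X -t sandbox_web_t firefox

the hard part is everything after that: writing the rules to make downloads, plugins, and the rest of the browser actually usable inside the sandbox.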
instead i use a combination of tunneling and secure filtering to lock down my browsing sessions. first, all of my traffic goes through an ssh SOCKS tunnel to a VPS i pay for (these can be had for as little as $4, with a 100GB or higher bandwidth cap, more than enough for general browsing). this immediately solves the "starbucks sniffer" problem, leaving the traffic between my VPS and the websites i'm connecting to as my only remaining worry. and it works everywhere i have internet access, thanks to my server-side HTTP-to-SSH proxy (example here and here) and proxytunnel (though i still need to check out corkscrew).
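the tunnel itself is one line; a sketch, with user, vps.example.com, and the proxy address as placeholders for my real details:

    # open a local SOCKS proxy on port 1080; everything sent to it is
    # forwarded through the encrypted ssh connection to the VPS
    ssh -N -D 1080 user@vps.example.com

when i'm stuck behind a network that only allows outbound HTTP, proxytunnel wraps the ssh connection inside the local proxy, configured in ~/.ssh/config:

    Host vps
        Hostname vps.example.com
        DynamicForward 1080
        # connect through the local HTTP proxy instead of directly;
        # corkscrew's equivalent: corkscrew proxy.example.com 8080 %h %p
        ProxyCommand proxytunnel -p proxy.example.com:8080 -d %h:%p

firefox then just needs its SOCKS host set to localhost:1080, with remote DNS enabled so lookups go through the tunnel too.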
for those connections, and the content delivered over them, i have an assortment of Firefox plugins. my current list is: Adblock Plus, Cert Viewer Plus, Certificate Patrol, Expiry Canary, facebooksecurelogin, Force-TLS, HTTPS-Everywhere, NoScript, Safe, SSL Blacklist, and WOT.
the end result? i have a lot more insight into what is going on behind the scenes as i browse the web:

- every time an old SSL certificate is replaced with a new one, i get a notification with a diff of the changes.
- when a site's certificate is about to expire i am notified, giving me advance warning that the site could potentially be exploited or become unavailable.
- all connections to frequently-visited sites such as Wikipedia, Facebook, Google, PayPal, and Twitter are forced to use SSL.
- when i'm on an HTTPS page, the border around my browser window turns green, confirming the whole page is indeed using SSL; if any element on the page is not loaded over HTTPS, the border turns red.
- if i submit a form that doesn't post to an HTTPS url, i am warned before it is sent.
- if any certificate uses MD5, i am warned.
- when browsing Google and other websites, i am warned if a site has a low or bad rating, has been reported as a malware site, etc. (and it's usually right on the money).
- with NoScript, any site i don't explicitly trust can't load potentially-malicious JavaScript, XSS attacks are prevented, and i can even force all cookies and javascript to use SSL to prevent interception or injection (a la Firesheep).
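the forced-SSL part doesn't have to come from my local rules either; if i remember right, Force-TLS honors the Strict-Transport-Security response header (the mechanism that became HSTS), so a site can demand HTTPS for all future visits with a single header:

    Strict-Transport-Security: max-age=31536000; includeSubDomains

max-age is in seconds (a year here), and includeSubDomains extends the pledge to every subdomain.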
with all these protections i have much more visibility into whether a site i'm on could potentially be serving malicious content, and my interactions with these and other sites are inherently more secure. of course, most of these plugins are only effective on the most popular sites by default, since complex per-site rules often have to be written to force or allow specific requests without breaking things (example below). but at least we're starting to get more secure by default instead of less.
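for a taste of what those rules look like, here's a minimal HTTPS-Everywhere ruleset; a sketch with placeholder hostnames, and real rules usually need exclusions for hosts or paths that break over SSL:

    <!-- rewrite all example.com requests to HTTPS -->
    <ruleset name="Example">
      <target host="example.com" />
      <target host="www.example.com" />
      <rule from="^http://(www\.)?example\.com/" to="https://www.example.com/" />
    </ruleset>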