The server can set Cache-Control however it wants, but if the client browser (Firefox, IE, etc.) says otherwise, the client rules. For example: consider an image served with proper Cache-Control and Expires headers. If the client sends “Cache-Control: max-age=0” in its request, a conditional GET request will be sent and an HTTP 304 will come back from the server. The expected and best behavior would be no request at all.
It’s not the server’s fault; it has done its part correctly. The client is now in control, and it seems Firefox and IE on my machine don’t like caching that much. So it’s 304s instead of zero requests for now.
Update: Firefox is actually pretty smart about this. If you click F5, Refresh, or do a direct reload, it sends Cache-Control: max-age=0, so the server returns 304s. But if you navigate from one page to the next and the new page reuses cached components, Firefox makes no request at all, which is the expected behavior. Now I can sleep tight knowing that our pages are faster because fewer conditional GET requests are made.
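To make the exchange concrete, here is a minimal sketch of the server-side decision for a conditional GET, using an ETag validator (the same logic applies to Last-Modified / If-Modified-Since). The function name and return shape are illustrative, not from any particular framework:

```javascript
// Decide how to answer a GET for a cacheable resource.
// requestHeaders: the incoming request headers (lowercased keys).
// currentEtag: the current version tag of the resource, e.g. '"v1"'.
function respond(requestHeaders, currentEtag) {
  if (requestHeaders['if-none-match'] === currentEtag) {
    // The client's cached copy is still valid: 304, no body re-sent.
    return { status: 304 };
  }
  // Full response, with caching headers so a well-behaved client can
  // skip the request entirely until max-age expires.
  return {
    status: 200,
    headers: {
      'Cache-Control': 'public, max-age=31536000',
      'ETag': currentEtag,
    },
  };
}
```

The catch described above is that on a hard refresh the browser sends max-age=0 anyway, forcing the conditional round trip even though the 304 path sends no body.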
Writing a persuasive message requires four components:
- Attention = What? You must get the audience’s attention
- Interest = Okay, then what? You continue to build their interest in the subject
- Desire = Why? Why should they care, what is in it for them?
- Action = How? Okay, I’m sold, what’s next?
Characteristics of a good persuasive message:
- Visual – use storytelling, put them in your shoes, share your vision
- Personal – talk to them directly, not as a group but as an individual, one-to-one
- Creative – use fresh ideas instead of clichés, which are boring. This is the hard part, because the longer people are exposed to media, the more likely they’ve seen it before and have learned to ignore, skip, or forget it
If you need to delay execution (like a sleep function), use setTimeout. If you need to do something repeatedly (like refreshing stats), use setInterval.
- window.setTimeout(show, 2000); // wait 2 seconds, then run show() once
- window.setInterval(show, 3000); // run show() every 3 seconds
Pass a function reference rather than a string like “show();” — the string form is evaluated with eval and is best avoided.
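Both functions return an id you can use to cancel the timer: setTimeout pairs with clearTimeout, setInterval with clearInterval. A minimal sketch (startPolling is a hypothetical helper, not a browser API):

```javascript
// Run `task` every `delayMs` milliseconds, at most `maxRuns` times,
// then cancel the interval so it stops firing.
function startPolling(task, delayMs, maxRuns) {
  let runs = 0;
  const id = setInterval(() => {
    task();
    runs += 1;
    if (runs >= maxRuns) {
      clearInterval(id); // stop the repetition once we've run enough
    }
  }, delayMs);
  return id; // caller can also clearInterval(id) early
}

// e.g. startPolling(show, 3000, 10); // run show() every 3s, 10 times total
```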
98% of Google’s revenue comes from advertising, so it’s not surprising that Google just released a hosted ad server solution after its recent acquisition of DoubleClick. They now have two solutions: one developed in-house for small and medium publishers, and one from DoubleClick for larger publishers. It’s only a matter of time before Yahoo or Microsoft either acquires or releases its own ad server. Since they’re all big companies, they cannot offer an independent ad serving solution; they must tie it into their own ad network(s) or there would be no revenue source. Publishers will probably need to share data and give up some control in exchange for the free ad serving benefits. Additionally, it would also be difficult for them to offer certain complex features like real-time stats, because that is a resource hog and would raise their costs per user significantly.
Google Ad Manager’s TOS
“You agree that Google may aggregate Program Data with data collected from other Program users, and use such aggregated data, provided that Google will only aggregate data in a manner such that no third party could identify which users’ data contributed to the aggregated set.”
We are flattered to know that our ad serving solution, AdSpeed, has several copycats. Requests to clone a given site seem quite common these days, and you can find them easily on the job/project boards for freelancers. Digg and YouTube both have copycats, and that’s simply not the end of the world. Even the big players copy features from each other (Digg vs. Yahoo Buzz, Yahoo Publisher vs. Google AdWords, etc.), and we’re no exception: learn the best practices, avoid the bad moves. It’s a natural learning process (think “baby”). However, it’s definitely distinct from copying exactly, without any adaptation or credit, like plagiarism.
I just found a hosted ad server that copies our processes, and even some of our content with minor modifications. For now, they don’t seem to be building a serious business, and we have not considered taking any legal action yet. Besides, there is significant complexity in the front-end, and a huge amount of knowledge and experience in our back-end infrastructure and application platform, that is very hard, if not impossible, to replicate.