15:52 Webinar: Google Analytics' Newest Features » Google Analytics Blog
As of this week, all the new Google Analytics features we recently announced should be available in all accounts! (And just yesterday, we announced one more - a new, asynchronous tracking code snippet.)
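
For reference, the new asynchronous snippet looks roughly like the following sketch (based on the announcement; UA-XXXXX-X is a placeholder for your own account ID). Because ga.js loads asynchronously, tracking no longer blocks the rest of the page from rendering:

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']);  // your Analytics account ID
_gaq.push(['_trackPageview']);             // record the pageview

(function() {
  // Create a script element that loads ga.js without blocking rendering.
  var ga = document.createElement('script');
  ga.src = ('https:' == document.location.protocol ?
      'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
  ga.setAttribute('async', 'true');
  document.documentElement.firstChild.appendChild(ga);
})();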

If you missed the announcements or are curious about the features you're now seeing, join us in this upcoming webinar, happening next Wednesday. We'll give an overview and demonstration of the features and share tips on best practices and uses. You'll learn how the following features have added more power, flexibility and intelligence to Google Analytics' enterprise-class capabilities:
  • Engagement Goals
  • Expanded Mobile Reporting
  • Advanced Table Filtering
  • Unique Visitor Metric
  • Multiple Custom Variables
  • Sharing Advanced Segments & Custom Reports
  • Analytics Intelligence
  • Custom Alerts
When: Wednesday, December 9, 2009
Time: 10-11 a.m. PST

Register here.

There will also be an opportunity for Q&A, so please submit your questions beforehand through Google Moderator.

We hope you'll come learn more about the latest features... and we may even have a few extra surprises to share then, too! Hope to see you there.

14:04 How fast is your site? » Google Webmaster Central Blog
We've just launched Site Performance, an experimental feature in Webmaster Tools that shows you information about the speed of your site and suggestions for making it faster.

This is a small step in our larger effort to make the web faster. Studies have repeatedly shown that speeding up your site leads to increased user retention and activity, higher revenue and lower costs. Towards the goal of making every webpage load as fast as flipping the pages of a magazine, we have provided articles on best practices, active discussion forums and many tools to diagnose and fix speed issues.

Now we bring data and statistics specifically applicable to your site. On the Site Performance page, you'll see how fast your pages load, how their speed has trended over time, how your site's load time compares with that of other sites, examples of specific pages with their actual load times, and Page Speed suggestions that can help reduce user-perceived latency. Our goal is to bring you specific, actionable speed information backed by data, so stay tuned for more of this in the future.

[Screenshot of the Site Performance report in Webmaster Tools]

The load time data is derived from aggregated information sent by users of your site who have installed the Google Toolbar and opted-in to its enhanced features. We only show the performance charts and tables when there's enough data, so not all of them may be shown if your site has little traffic. The data currently represents a global average; a specific user may experience your site faster or slower than the average depending on their location and network conditions.

This is a Labs product that is still in development. We hope you find it useful. Please let us know your feedback through the Webmaster Tools Forum.

08:02 New User Agent for News » Google Webmaster Central Blog
Webmaster Level: Intermediate

Today we are announcing a new user agent for robots.txt called Googlebot-News, which gives publishers even more control over their content. In case you haven't heard of robots.txt, it's a web-wide standard that has been in use since 1994 and is supported by all major search engines and well-behaved "robots" that process the web. When a search engine checks whether it has permission to crawl and index a web page, robots.txt is the mechanism it consults.

Previously, publishers who wanted to be in Google's web search index but not in Google News could contact us via a form. Now, publishers can manage their content in Google News in a more automated way: site owners simply add Googlebot-News specific directives to their robots.txt file. Like the Googlebot and Googlebot-Image user agents, the new Googlebot-News user agent can be used to specify which pages of a website should be crawled and, ultimately, appear in Google News.

Here are a few examples for publishers:

Include pages in both Google web search and News:
User-agent: Googlebot
Disallow:

This is the simplest case. In fact, no robots.txt file is even required here.

Include pages in Google web search, but not in News:
User-agent: Googlebot
Disallow:

User-agent: Googlebot-News
Disallow: /

This robots.txt file says that no files are disallowed from Google's general web crawler, called Googlebot, but the user agent "Googlebot-News" is blocked from all files on the website.

Include pages in Google News, but not Google web search:
User-agent: Googlebot
Disallow: /

User-agent: Googlebot-News
Disallow:

When parsing a robots.txt file, Google obeys the most specific directive. The first two lines tell us that Googlebot (the user agent for Google's web index) is blocked from crawling any pages from the site. The next directive, which applies to the more specific user agent for Google News, overrides the blocking of Googlebot and gives permission for Google News to crawl pages from the website.
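
To make this precedence concrete, here is a minimal sketch (in JavaScript, and deliberately not Google's actual parser) of how most-specific matching can work: the longest User-agent token that matches the crawler's name wins, so a Googlebot-News group overrides a Googlebot group, which in turn overrides a catch-all "*" group.

function selectGroup(groups, crawlerName) {
  // groups maps a User-agent token to its list of Disallow paths.
  var bestToken = null, bestLen = -1;
  var name = crawlerName.toLowerCase();
  for (var token in groups) {
    var t = token.toLowerCase();
    // "*" matches every crawler; otherwise the token must be a prefix
    // of the crawler name ("googlebot" matches "googlebot-news").
    if (t === '*' || name.indexOf(t) === 0) {
      if (t.length > bestLen) { bestToken = token; bestLen = t.length; }
    }
  }
  // No matching group means nothing is disallowed.
  return bestToken === null ? [] : groups[bestToken];
}

// The "News but not web search" example above:
var groups = { 'Googlebot': ['/'], 'Googlebot-News': [] };
selectGroup(groups, 'Googlebot-News');  // -> []    (nothing disallowed)
selectGroup(groups, 'Googlebot');       // -> ['/'] (everything disallowed)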

Block different sets of pages from Google web search and Google News:
User-agent: Googlebot
Disallow: /latest_news

User-agent: Googlebot-News
Disallow: /archives

The pages blocked from Google web search and Google News can be controlled independently. This robots.txt file blocks recent news articles (URLs in the /latest_news folder) from Google web search, but allows them to appear on Google News. Conversely, it blocks premium content (URLs in the /archives folder) from Google News, but allows them to appear in Google web search.

Stop Google web search and Google News from crawling pages:
User-agent: Googlebot
Disallow: /

This robots.txt file tells Google that Googlebot, the user agent for our web search crawler, should not crawl any pages from the site. Because no specific directive for Googlebot-News is given, our News search will abide by the general guidance for Googlebot and will not crawl pages for Google News.

For some queries, we display results from Google News in a discrete box or section on the web search results page, alongside our regular web search results; we sometimes do this for Images, Videos, Maps, and Products, too. These are known as Universal search results. Since Google News powers Universal "News" results, blocking the Googlebot-News user agent will keep your site's news stories out of Universal search results as well.

We are currently testing our support for the new user agent; if you see any problems, please let us know. Note that in some situations Google may return a link to a page even though we didn't crawl that page. If you'd like to read more about robots.txt, we provide additional documentation on our website. We hope webmasters will enjoy the flexibility and easier management that the Googlebot-News user agent provides.

《Joel谈软件》 (More Joel on Software) Is Published » 阮一峰的网络日志 (Ruan Yifeng's Blog) » 车东's shared items in Google Reader

Last Friday, I suddenly came down with a high fever.

I went to the emergency room. The doctor said the good news was that I had a bacterial respiratory infection, so it wasn't H1N1; the bad news was that the infection was severe and could well turn into pneumonia. So I stayed in bed until yesterday.

In the few days I was sick, the world changed dramatically: Mininova shut down! It took down every torrent file it wasn't licensed to carry. Whether or not Obama visits China, the world stays the same; but with Mininova gone, the sky over Shanghai suddenly dimmed. One foreigner wrote: "My heart is broken into a million pieces." I feel the same; from now on, BitTorrent downloads won't be nearly as convenient. I've started seriously wondering whether Usenet's business opportunity has arrived.

==========================

All right, on to today's topic.

My translation of More Joel on Software has been published.

The Title

The Chinese edition is titled 《软件随想录》 (roughly, "Random Thoughts on Software"), with the subtitle 《程序员部落酋长 Joel谈软件》 ("Joel, Chieftain of the Programmer Tribe, on Software").

Don't ask me why it's called that; I don't know either. It was the publisher's decision, probably in hopes of winning over more non-specialist readers.

Also, since the publisher has the final say, the wording of some sentences and the titles of certain essays were changed, and there was nothing I could do about it. For example, the essay "The Perils of JavaSchools" is now titled "The Perils of Schools That Teach Only Java". Judge for yourself.

Editing

I delivered the manuscript at the end of August, and it went to the printer at the end of November. The whole editing process took only three months, which is quite efficient.

In between, several editors proofread it word by word, multiple times; their diligence amazed me. So I can assure readers that typos and mangled sentences in this book should be few and far between.

Price

The list price is 49 yuan.

I think that's quite expensive, a good deal more than I expected. But online bookstores sell it at 25% off, so in general you can get it for thirty-odd yuan.

Payment

I haven't received my fee yet. It should be around 10,000 yuan, quite possibly less. The book runs to 180,000 characters, so a quick calculation shows I was paid the lowest going rate for translation. Back when I was a graduate student and still a rookie translator, the rates I received were considerably higher than this. But that doesn't matter, because I never translated this book for the money.

What truly made me miserable was that I never imagined the translation would drag on for nine months, averaging just six hundred-odd characters a day! Toward the end I was nearly sick of it, with no end in sight. In short, I will never again take on this kind of self-inflicted suffering.

Also, my fee was a one-time buyout; even if the book becomes a bestseller, I won't earn a cent more. So there is no need to buy it just to support me.

Other Notes

Some online bookstores are already taking pre-orders for the book. I expect it will be in stock within two weeks.

Finally, I've put together a bare-bones homepage for the book: http://www.ruanyifeng.com/mjos

=========================

Preface to the Chinese Edition

Welcome to the Chinese edition of More Joel on Software.

One thing is certain: trade barriers are falling all over the world, yet the gulf between national software industries remains astonishingly wide. This is especially striking given that most of us use the same tools and technologies, such as Unix, the Internet, C#, Windows, and object-oriented programming. The tools a Chinese programmer uses to solve problems are essentially the same ones used by programmers everywhere else.

So I am delighted that some of my crazy ideas can reach you all the way over in China. The credit goes to the book's publisher and to the Chinese edition's translator and editors, whose hard work has removed the language barrier between us.

You could give this book to your boss, but I assure you that is a genuinely bad idea, because bosses in many countries are already up in arms against me. I get email like that all day long, and I'd rather not get any more. On the other hand, this book is made of paper, which makes it far handier than a website: you can tear out its pages at any time to line a birdcage or wrap a live fish. I promise you, that is the most productive use of this book you will ever find. And if you do it the moment the book arrives, none of my nonsense will be able to affect you.

I hope you enjoy it!

Joel Spolsky, New York, USA

November 2009

========================

Translator's Preface

At the end of 2008, when I received the 300-page More Joel on Software, I never expected the translation to take more than nine months. Everything in my life was postponed because of it, to say nothing of the fatigue of typing, the pressure to deliver quickly, the irritation of monotonous work, and the frustration of puzzling over passages whose meaning still escaped me. Had I known I would spend nine months living like that, I would not have agreed to translate this book.

And yet it really is a good book, one that is bound to circulate widely and still be read many years from now. So perhaps I would have agreed anyway. Sometimes you get carried away, wanting to take part in something out of the ordinary, wanting your name tied to such a thing... who knows.

I received a great deal of help during the translation, and I want to give thanks here.

I thank the book's managing editor at the publisher for being so forgiving as I missed deadline after deadline.

I thank the translators in Taiwan, who selflessly translated many of Joel's essays into Chinese and posted them online (local.joelonsoftware.com). I consulted their translations and drew inspiration from them.

I thank the readers who left comments pointing out my translation errors. Every one of their comments is preserved on my blog.

I thank Google Dictionary and Wikipedia. Without these two sites, my translation could not be what it is; many passages I would never have understood. I consider Google Dictionary the most powerful electronic dictionary available today, and Wikipedia the finest thing in this world that I can think of.

Finally, thank you for reading, and I hope you like my translation.

阮一峰 (Ruan Yifeng)

August 20, 2009, Shanghai

(End)

