This plugin has not been tested with the latest 3 major releases of WordPress. The developer may no longer maintain or support it, and it may have compatibility issues when used with more recent versions of WordPress.

DB Robots.txt

Description

DB Robots.txt is an easy (i.e. automated) solution for creating and managing a robots.txt file for your site. It lets you create a robots.txt file without FTP access.

If the plugin detects an existing XML sitemap file, it will be included in the robots.txt file.
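
Under the hood, WordPress serves a virtual robots.txt and exposes a robots_txt filter for extending it. The snippet below is a minimal sketch, not the plugin's actual code, of how a Sitemap line could be appended through that filter; the sitemap path is an assumption.

  <?php
  // Minimal sketch (placed in a small plugin or the theme's functions.php):
  // append a hypothetical Sitemap directive to WordPress's virtual
  // robots.txt via the core 'robots_txt' filter.
  add_filter( 'robots_txt', function ( $output, $public ) {
      if ( $public ) { // only when the site is not set to discourage indexing
          $output .= "\nSitemap: " . home_url( '/sitemap.xml' ) . "\n";
      }
      return $output;
  }, 10, 2 );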

Earlier versions automatically included the Host rule for Yandex; as of version 2.0 this directive has been removed, because Yandex no longer supports it.

Installation

  1. Upload the bisteinoff-robots-txt folder to the /wp-content/plugins/ directory
  2. Activate the plugin through the ‘Plugins’ menu in WordPress
  3. Enjoy

FAQ

Will it conflict with any existing robots.txt file?

If a physical robots.txt file exists on your site, WordPress won’t process any request for one, so there will be no conflict.
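
For context, WordPress only generates its virtual robots.txt when no physical file is present; the hedged sketch below (assuming a root install, so ABSPATH is also the web root) shows how that condition could be checked.

  <?php
  // Hedged sketch, assuming WordPress is installed at the site root so that
  // ABSPATH points at the web root. If a physical robots.txt exists, the web
  // server returns it directly and WordPress (including the 'robots_txt'
  // filter) never handles the request.
  if ( file_exists( ABSPATH . 'robots.txt' ) ) {
      // Physical file wins: any virtual robots.txt output is bypassed.
  } else {
      // No physical file: requests for /robots.txt reach WordPress,
      // which serves the filtered virtual robots.txt.
  }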

Will this work for sub-folder installations of WordPress?

Out of the box, no. Because WordPress is installed in a sub-folder, it won’t “know” when someone requests the robots.txt file, which must be at the root of the site.

Contributors & Developers

The following people have contributed to the development of the open source software “DB Robots.txt”.

Contributors

Translate the “DB Robots.txt” plugin into Traditional Chinese (Taiwan)

Interested in development?

Anyone can browse the code, check out the SVN repository, or subscribe to the development log by RSS.

Changelog

2.2

  • Fixed Sitemap option

2.1

  • Tested with WordPress 5.5.
  • Added wp-sitemap.xml

2.0

  • Tested with WordPress 5.0.
  • The old Host directive has been removed, as it is no longer supported by Yandex.
  • The robots directives have been improved and updated.
  • Added robots directives that prevent indexing of duplicate URLs containing UTM, Openstat, From, GCLID, YCLID and YMCLID parameters.

1.0

  • Initial release.