Translate library/urllib.robotparser.po #208

Merged: 3 commits, Jan 27, 2022
37 changes: 29 additions & 8 deletions library/urllib.robotparser.po
@@ -3,24 +3,27 @@
# This file is distributed under the same license as the Python package.
#
# Translators:
# Adrian Liaw <[email protected]>, 2018
# Phil Lin <[email protected]>, 2022
msgid ""
msgstr ""
"Project-Id-Version: Python 3.10\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2021-10-26 16:47+0000\n"
-"PO-Revision-Date: 2018-05-23 16:14+0000\n"
-"Last-Translator: Adrian Liaw <adrianliaw2000@gmail.com>\n"
+"PO-Revision-Date: 2022-01-23 14:36+0800\n"
+"Last-Translator: Phil Lin <linooohon@gmail.com>\n"
"Language-Team: Chinese - TAIWAN (https://github.com/python/python-docs-zh-"
"tw)\n"
"Language: zh_TW\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"Plural-Forms: nplurals=1; plural=0;\n"
"X-Generator: Poedit 3.0.1\n"

#: ../../library/urllib.robotparser.rst:2
msgid ":mod:`urllib.robotparser` --- Parser for robots.txt"
-msgstr ""
+msgstr ":mod:`urllib.robotparser` --- robots.txt 的解析器"

#: ../../library/urllib.robotparser.rst:10
msgid "**Source code:** :source:`Lib/urllib/robotparser.py`"
@@ -34,42 +37,52 @@ msgid ""
"on the structure of :file:`robots.txt` files, see http://www.robotstxt.org/"
"orig.html."
msgstr ""
"此模組提供了一個單獨的類別 :class:`RobotFileParser`, 它可以知道某個特定 user "
A collaborator commented:

  • Chinese-style (full-width) punctuation should be used.
  • According to the glossary, terms such as module and class should, at their first occurrence in a document, be given with both the original term and its translation. Two forms are acceptable: if the first occurrence is rendered as module(模組), later occurrences in the document can simply use the original word module; if the first occurrence is rendered as 模組 (module), later occurrences can simply be translated as 模組.
    Besides module and class, this rule also applies to method.
Suggested change
-"此模組提供了一個單獨的類別 :class:`RobotFileParser`, 它可以知道某個特定 user "
+"此 module(模組)提供了一個單獨的 class(類別)\\ :class:`RobotFileParser`\\ ,它可以知道某個特定 user "

"agent (使用者代理)是否能在有發布 :file:`robots.txt` 文件的網站 fetch(擷"
"取)此網站特定的 URL。有關 :file:`robots.txt` 文件結構的更多細節,請參閱 "
"http://www.robotstxt.org/orig.html 。"

#: ../../library/urllib.robotparser.rst:28
msgid ""
"This class provides methods to read, parse and answer questions about the :"
"file:`robots.txt` file at *url*."
msgstr ""
"此類別提供了一些方法可以讀取、解析和回答關於 *url* 上的 :file:`robots.txt` 文"
"件的問題。"

#: ../../library/urllib.robotparser.rst:33
msgid "Sets the URL referring to a :file:`robots.txt` file."
-msgstr ""
+msgstr "設置指向 :file:`robots.txt` 文件的 URL。"

#: ../../library/urllib.robotparser.rst:37
msgid "Reads the :file:`robots.txt` URL and feeds it to the parser."
-msgstr ""
+msgstr "讀取 :file:`robots.txt` URL 並將其輸入到解析器。"

#: ../../library/urllib.robotparser.rst:41
msgid "Parses the lines argument."
-msgstr ""
+msgstr "解析行參數(此參數為 ``robots.txt`` 文件裡的行)。"

#: ../../library/urllib.robotparser.rst:45
msgid ""
"Returns ``True`` if the *useragent* is allowed to fetch the *url* according "
"to the rules contained in the parsed :file:`robots.txt` file."
msgstr ""
"如果根據被解析的 :file:`robots.txt` 文件中的規則,*useragent* 被允許 fetch "
"*url* 的話,則返回 ``True``。"

#: ../../library/urllib.robotparser.rst:51
msgid ""
"Returns the time the ``robots.txt`` file was last fetched. This is useful "
"for long-running web spiders that need to check for new ``robots.txt`` files "
"periodically."
msgstr ""
"返回最近一次 fetch ``robots.txt`` 文件的時間。這適用於需要定期檢查 ``robots."
"txt`` 文件更新情況的長時間運行網頁爬蟲。"

#: ../../library/urllib.robotparser.rst:57
msgid ""
"Sets the time the ``robots.txt`` file was last fetched to the current time."
-msgstr ""
+msgstr "將最後一次獲取 ``robots.txt`` 文件的時間設置為當前時間。"

#: ../../library/urllib.robotparser.rst:62
msgid ""
@@ -78,6 +91,9 @@ msgid ""
"apply to the *useragent* specified or the ``robots.txt`` entry for this "
"parameter has invalid syntax, return ``None``."
msgstr ""
"針對指定的 *useragent* 從 ``robots.txt`` 返回 ``Crawl-delay`` 參數的值。如果"
"此參數不存在或不適用於指定的 *useragent* ,或是此參數在 ``robots.txt`` 存在語"
"法錯誤,則返回 ``None``。"

#: ../../library/urllib.robotparser.rst:71
msgid ""
@@ -86,16 +102,21 @@ msgid ""
"such parameter or it doesn't apply to the *useragent* specified or the "
"``robots.txt`` entry for this parameter has invalid syntax, return ``None``."
msgstr ""
"以 :term:`named tuple` ``RequestRate(requests, seconds)`` 的形式從 ``robots."
"txt`` 返回 ``Request-rate`` 參數的內容。如果此參數不存在或不適用於指定的 "
"*useragent* ,或是此參數在 ``robots.txt`` 存在語法錯誤,則返回 ``None``。"

#: ../../library/urllib.robotparser.rst:81
msgid ""
"Returns the contents of the ``Sitemap`` parameter from ``robots.txt`` in the "
"form of a :func:`list`. If there is no such parameter or the ``robots.txt`` "
"entry for this parameter has invalid syntax, return ``None``."
msgstr ""
"以 :func:`list` 的形式從 ``robots.txt`` 返回 ``Sitemap`` 參數的內容。如果此參"
"數不存在或此參數在 ``robots.txt`` 存在語法錯誤,則返回 ``None``。"

#: ../../library/urllib.robotparser.rst:89
msgid ""
"The following example demonstrates basic use of the :class:`RobotFileParser` "
"class::"
-msgstr ""
+msgstr "下面的範例展示了 :class:`RobotFileParser` 類別的基本用法::"
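The final entry above refers to a usage example that is not included in this diff chunk. As a minimal sketch (not the docs' own example), the :class:`RobotFileParser` methods translated in this file can be exercised by feeding rules directly to ``parse()`` instead of fetching a real ``robots.txt`` over the network; the domain and rules below are illustrative assumptions:

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
# parse() accepts the robots.txt content as an iterable of lines, so no
# network access is needed (set_url() + read() would fetch it instead).
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
    "Crawl-delay: 5",
    "Request-rate: 3/20",
])

print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
print(rp.crawl_delay("*"))    # 5
print(rp.request_rate("*"))   # RequestRate(requests=3, seconds=20)
```

``request_rate()`` returns the ``RequestRate`` named tuple mentioned in the entry for line 71 above, and both it and ``crawl_delay()`` return ``None`` when the parameter is absent or malformed.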