
Commit 8e73006

Translate library/urllib.robotparser.po (#208)
1 parent 3e75b1b commit 8e73006

File tree: 1 file changed

library/urllib.robotparser.po

Lines changed: 32 additions & 7 deletions
@@ -3,24 +3,27 @@
 # This file is distributed under the same license as the Python package.
 #
 # Translators:
+# Adrian Liaw <[email protected]>, 2018
+# Phil Lin <[email protected]>, 2022
 msgid ""
 msgstr ""
 "Project-Id-Version: Python 3.10\n"
 "Report-Msgid-Bugs-To: \n"
 "POT-Creation-Date: 2021-10-26 16:47+0000\n"
-"PO-Revision-Date: 2018-05-23 16:14+0000\n"
-"Last-Translator: Adrian Liaw <adrianliaw2000@gmail.com>\n"
+"PO-Revision-Date: 2022-01-27 13:40+0800\n"
+"Last-Translator: Phil Lin <linooohon@gmail.com>\n"
 "Language-Team: Chinese - TAIWAN (https://github.com/python/python-docs-zh-"
 "tw)\n"
 "Language: zh_TW\n"
 "MIME-Version: 1.0\n"
 "Content-Type: text/plain; charset=UTF-8\n"
 "Content-Transfer-Encoding: 8bit\n"
 "Plural-Forms: nplurals=1; plural=0;\n"
+"X-Generator: Poedit 3.0.1\n"
 
 #: ../../library/urllib.robotparser.rst:2
 msgid ":mod:`urllib.robotparser` --- Parser for robots.txt"
-msgstr ""
+msgstr ":mod:`urllib.robotparser` --- robots.txt 的剖析器"
 
 #: ../../library/urllib.robotparser.rst:10
 msgid "**Source code:** :source:`Lib/urllib/robotparser.py`"
@@ -34,42 +37,52 @@ msgid ""
 "on the structure of :file:`robots.txt` files, see http://www.robotstxt.org/"
 "orig.html."
 msgstr ""
+"此模組 (module) 提供了一個單獨的類別 (class) \\ :class:`RobotFileParser`\\ ,"
+"它可以知道某個特定 user agent(使用者代理)是否能在有發布 :file:`robots.txt` "
+"文件的網站 fetch(擷取)特定 URL。有關 :file:`robots.txt` 文件結構的更多細"
+"節,請參閱 http://www.robotstxt.org/orig.html。"
 
 #: ../../library/urllib.robotparser.rst:28
 msgid ""
 "This class provides methods to read, parse and answer questions about the :"
 "file:`robots.txt` file at *url*."
 msgstr ""
+"此類別提供了一些方法可以讀取、剖析和回答關於 *url* 上的 :file:`robots.txt` 文"
+"件的問題。"
 
 #: ../../library/urllib.robotparser.rst:33
 msgid "Sets the URL referring to a :file:`robots.txt` file."
-msgstr ""
+msgstr "設置指向 :file:`robots.txt` 文件的 URL。"
 
 #: ../../library/urllib.robotparser.rst:37
 msgid "Reads the :file:`robots.txt` URL and feeds it to the parser."
-msgstr ""
+msgstr "讀取 :file:`robots.txt` URL 並將其輸入到剖析器。"
 
 #: ../../library/urllib.robotparser.rst:41
 msgid "Parses the lines argument."
-msgstr ""
+msgstr "剖析 lines 引數。"
 
 #: ../../library/urllib.robotparser.rst:45
 msgid ""
 "Returns ``True`` if the *useragent* is allowed to fetch the *url* according "
 "to the rules contained in the parsed :file:`robots.txt` file."
 msgstr ""
+"根據從 :file:`robots.txt` 文件中剖析出的規則,如果 *useragent* 被允許 fetch "
+"*url* 的話,則回傳 ``True``。"
 
 #: ../../library/urllib.robotparser.rst:51
 msgid ""
 "Returns the time the ``robots.txt`` file was last fetched. This is useful "
 "for long-running web spiders that need to check for new ``robots.txt`` files "
 "periodically."
 msgstr ""
+"回傳最近一次 fetch ``robots.txt`` 文件的時間。這適用於需要定期檢查 ``robots."
+"txt`` 文件更新情況的長時間運行網頁爬蟲。"
 
 #: ../../library/urllib.robotparser.rst:57
 msgid ""
 "Sets the time the ``robots.txt`` file was last fetched to the current time."
-msgstr ""
+msgstr "將最近一次 fetch ``robots.txt`` 文件的時間設置為當前時間。"
 
 #: ../../library/urllib.robotparser.rst:62
 msgid ""
@@ -78,6 +91,9 @@ msgid ""
 "apply to the *useragent* specified or the ``robots.txt`` entry for this "
 "parameter has invalid syntax, return ``None``."
 msgstr ""
+"針對指定的 *useragent* 從 ``robots.txt`` 回傳 ``Crawl-delay`` 參數的值。如果"
+"此參數不存在、不適用於指定的 *useragent* ,或是此參數在 ``robots.txt`` 中所指"
+"的條目含有無效語法,則回傳 ``None``。"
 
 #: ../../library/urllib.robotparser.rst:71
 msgid ""
@@ -86,16 +102,25 @@ msgid ""
 "such parameter or it doesn't apply to the *useragent* specified or the "
 "``robots.txt`` entry for this parameter has invalid syntax, return ``None``."
 msgstr ""
+"以 :term:`named tuple` ``RequestRate(requests, seconds)`` 的形式從 ``robots."
+"txt`` 回傳 ``Request-rate`` 參數的內容。如果此參數不存在、不適用於指定的 "
+"*useragent* ,或是此參數在 ``robots.txt`` 中所指的條目含有無效語法,則回傳 "
+"``None``。"
 
 #: ../../library/urllib.robotparser.rst:81
 msgid ""
 "Returns the contents of the ``Sitemap`` parameter from ``robots.txt`` in the "
 "form of a :func:`list`. If there is no such parameter or the ``robots.txt`` "
 "entry for this parameter has invalid syntax, return ``None``."
 msgstr ""
+"以 :func:`list` 的形式從 ``robots.txt`` 回傳 ``Sitemap`` 參數的內容。如果此參"
+"數不存在或此參數在 ``robots.txt`` 中所指的條目含有無效語法,則回傳 ``None``。"
 
 #: ../../library/urllib.robotparser.rst:89
 msgid ""
 "The following example demonstrates basic use of the :class:`RobotFileParser` "
 "class::"
 msgstr ""
+"下面的範例展示了 :class:`RobotFileParser` 類別的基本用法:\n"
+"\n"
+"::"

0 commit comments
