# This file is distributed under the same license as the Python package.
#
# Translators:
+ # Adrian Liaw <adrianliaw2000@gmail.com>, 2018
+ # Phil Lin <linooohon@gmail.com>, 2022
msgid ""
msgstr ""
"Project-Id-Version: Python 3.10\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2021-10-26 16:47+0000\n"
- "PO-Revision-Date: 2018-05-23 16:14+0000\n"
- "Last-Translator: Adrian Liaw <adrianliaw2000@gmail.com>\n"
+ "PO-Revision-Date: 2022-01-27 13:40+0800\n"
+ "Last-Translator: Phil Lin <linooohon@gmail.com>\n"
"Language-Team: Chinese - TAIWAN (https://github.com/python/python-docs-zh-"
"tw)\n"
"Language: zh_TW\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"Plural-Forms: nplurals=1; plural=0;\n"
+ "X-Generator: Poedit 3.0.1\n"

#: ../../library/urllib.robotparser.rst:2
msgid ":mod:`urllib.robotparser` --- Parser for robots.txt"
- msgstr ""
+ msgstr ":mod:`urllib.robotparser` --- robots.txt 的剖析器"

#: ../../library/urllib.robotparser.rst:10
msgid "**Source code:** :source:`Lib/urllib/robotparser.py`"
@@ -34,42 +37,52 @@ msgid ""
"on the structure of :file:`robots.txt` files, see http://www.robotstxt.org/"
"orig.html."
msgstr ""
+ "此模組 (module) 提供了一個單獨的類別 (class) \\ :class:`RobotFileParser`\\ ,"
+ "它可以知道某個特定 user agent(使用者代理)是否能在有發布 :file:`robots.txt` "
+ "文件的網站 fetch(擷取)特定 URL。有關 :file:`robots.txt` 文件結構的更多細"
+ "節,請參閱 http://www.robotstxt.org/orig.html。"

#: ../../library/urllib.robotparser.rst:28
msgid ""
"This class provides methods to read, parse and answer questions about the :"
"file:`robots.txt` file at *url*."
msgstr ""
+ "此類別提供了一些方法可以讀取、剖析和回答關於 *url* 上的 :file:`robots.txt` 文"
+ "件的問題。"

#: ../../library/urllib.robotparser.rst:33
msgid "Sets the URL referring to a :file:`robots.txt` file."
- msgstr ""
+ msgstr "設置指向 :file:`robots.txt` 文件的 URL。"

#: ../../library/urllib.robotparser.rst:37
msgid "Reads the :file:`robots.txt` URL and feeds it to the parser."
- msgstr ""
+ msgstr "讀取 :file:`robots.txt` URL 並將其輸入到剖析器。"

#: ../../library/urllib.robotparser.rst:41
msgid "Parses the lines argument."
- msgstr ""
+ msgstr "剖析 lines 引數。"

#: ../../library/urllib.robotparser.rst:45
msgid ""
"Returns ``True`` if the *useragent* is allowed to fetch the *url* according "
"to the rules contained in the parsed :file:`robots.txt` file."
msgstr ""
+ "根據從 :file:`robots.txt` 文件中剖析出的規則,如果 *useragent* 被允許 fetch "
+ "*url* 的話,則回傳 ``True``。"

#: ../../library/urllib.robotparser.rst:51
msgid ""
"Returns the time the ``robots.txt`` file was last fetched. This is useful "
"for long-running web spiders that need to check for new ``robots.txt`` files "
"periodically."
msgstr ""
+ "回傳最近一次 fetch ``robots.txt`` 文件的時間。這適用於需要定期檢查 ``robots."
+ "txt`` 文件更新情況的長時間運行網頁爬蟲。"

#: ../../library/urllib.robotparser.rst:57
msgid ""
"Sets the time the ``robots.txt`` file was last fetched to the current time."
- msgstr ""
+ msgstr "將最近一次 fetch ``robots.txt`` 文件的時間設置為當前時間。"

#: ../../library/urllib.robotparser.rst:62
msgid ""
@@ -78,6 +91,9 @@ msgid ""
"apply to the *useragent* specified or the ``robots.txt`` entry for this "
"parameter has invalid syntax, return ``None``."
msgstr ""
+ "針對指定的 *useragent* 從 ``robots.txt`` 回傳 ``Crawl-delay`` 參數的值。如果"
+ "此參數不存在、不適用於指定的 *useragent* ,或是此參數在 ``robots.txt`` 中所指"
+ "的條目含有無效語法,則回傳 ``None``。"

#: ../../library/urllib.robotparser.rst:71
msgid ""
@@ -86,16 +102,25 @@ msgid ""
"such parameter or it doesn't apply to the *useragent* specified or the "
"``robots.txt`` entry for this parameter has invalid syntax, return ``None``."
msgstr ""
+ "以 :term:`named tuple` ``RequestRate(requests, seconds)`` 的形式從 ``robots."
+ "txt`` 回傳 ``Request-rate`` 參數的內容。如果此參數不存在、不適用於指定的 "
+ "*useragent* ,或是此參數在 ``robots.txt`` 中所指的條目含有無效語法,則回傳 "
+ "``None``。"

#: ../../library/urllib.robotparser.rst:81
msgid ""
"Returns the contents of the ``Sitemap`` parameter from ``robots.txt`` in the "
"form of a :func:`list`. If there is no such parameter or the ``robots.txt`` "
"entry for this parameter has invalid syntax, return ``None``."
msgstr ""
+ "以 :func:`list` 的形式從 ``robots.txt`` 回傳 ``Sitemap`` 參數的內容。如果此參"
+ "數不存在或此參數在 ``robots.txt`` 中所指的條目含有無效語法,則回傳 ``None``。"

#: ../../library/urllib.robotparser.rst:89
msgid ""
"The following example demonstrates basic use of the :class:`RobotFileParser` "
"class::"
msgstr ""
+ "下面的範例展示了 :class:`RobotFileParser` 類別的基本用法:\n"
+ "\n"
+ "::"